[mdx] Feedback on https://datatracker.ietf.org/doc/draft-young-md-query/

Leif Johansson leifj at sunet.se
Mon Aug 26 12:03:29 PDT 2013


On 08/23/2013 04:00 PM, Cantor, Scott wrote:
> On 8/23/13 4:33 AM, "Leif Johansson" <leifj at sunet.se> wrote:
>>> -- 2.2 specifies that "all metadata retrieval requests MUST use
>>>    the GET method" -- would this preclude the normal use of HEAD
>>>    as an alternative to test for changed data, etc.?
>> Agree. Let's describe the use of the HEAD method for cache
>> validity checks.
> -1
>
> We don't need HEAD to do cache checks, and that just adds multiple ways of
> doing the same thing.
I think it's reasonable to expect that HTTP clients are going to do what
they usually do for cache checks - some use HEAD.
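To make the alternatives in this exchange concrete, here is a minimal sketch (function names are illustrative, not from the draft) of the conditional-GET style of cache revalidation Scott is alluding to - the client sends its cached validators and treats 304 Not Modified as "cache still good", with no separate HEAD round-trip needed:

```python
# Hypothetical sketch of HTTP cache revalidation via conditional GET,
# as an alternative to a separate HEAD request. The helper names are
# made up for illustration; the header semantics come from HTTP itself.

def revalidation_headers(cached_etag=None, cached_last_modified=None):
    """Build conditional-request headers from a cached metadata document."""
    headers = {}
    if cached_etag:
        headers["If-None-Match"] = cached_etag
    if cached_last_modified:
        headers["If-Modified-Since"] = cached_last_modified
    return headers

def cache_still_valid(status_code):
    # 304 Not Modified means the cached copy can be reused as-is.
    return status_code == 304
```

A client doing HEAD instead would compare the returned ETag/Last-Modified against its cached values itself; the conditional GET pushes that comparison to the server and saves a round-trip when the document has changed.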
>
>>> -- 2.5 appears to just recapitulate status codes from RFC 2616. If
>>>    so, rather than repeat them verbatim in this draft, I'd suggest
>>>    that a reference simply be made to the appropriate section of
>>>    the RFC.
>> Also good point.
> Disagree. We're defining what the codes mean in our context here.
why?
>>> -- 3.1, Identifiers: Prohibits an identifier starting with a left
>>>    curly brace, but is silent about other potentially magical
>>>    characters such as plus signs... (I assume that's because 3.2.1
>>>    specifies that the identifier "must be properly URL encoded" but
>>>    that requirement should really be part of 3.1, not 3.2.1,
>>>    shouldn't it?)
> No, it's precluding the brace because it has special meaning. The encoding
> of the character should be irrelevant here. We're talking about the
> decoded value not being a brace.
check
>>> -- 3.1.1, Transforms: are the identifiers case sensitive? That is,
>>>    'md5' is mentioned as is 'sha1' -- is that equivalent to
>>>    'mD5' and 'Md5' and 'MD5', for example?
>> They should be case-insensitive.
> I'd prefer not. URLs are case sensitive, and I'd rather not introduce
> folding requirements.
I guess I can buy that argument; however, doing case-folding is
not unreasonable in terms of work, and it would make the protocol
more robust.
>>>    As a wanna-be crypto guy, I also have some concern about
>>>    a newish spec specifying deprecated hashes (e.g., MD5 should
>>>    really not be the "lowest common denominator" required
>>>    transform for this or any other purpose)
>> We should do a threat-analysis....
> We're not using the hashes for security here, so I doubt there's much
> concern.
Not sure I agree - we're using hashes to look up security-critical
objects. We need to be sure of this if we decide to keep MD5.
>>> -- 3.2.1, Request: How long can a request be? Hypothetically, what
>>>    if I have a laundry list of a thousand IDs, all transformed into
>>>    SHA1 hashes? That could make for a long request!
>> ... or sha256
> There really is no clean way to address URL length that I've ever seen.
>
>>>    In 3.2.1, are the curly braces around the IDs literal curly braces
>>>    or is that just a semantic representation issue?
>> cf note about needing to discuss {} above
> It's a specification of the actual character, after any decoding.
>
>>> -- 5.1, Integrity: Seeing that an integrity check is only RECOMMENDED
>>>    rather than REQUIRED bothers me, a lot. I think this is a fundamental
>>>    mistake. Content integrity checks should be MANDATORY IMHO.
>> agree
> The reason it wasn't required is for protocols like OpenID and OAuth. We
> were trying to be general and not limit this to SAML and our use cases.
Funny you should say that - I've heard rumours that certain large
deployers of OAuth may be coming around to the realization that
keeping lots of bearer-token-state around is expensive, and could
somebody please figure out a way to use signed JSON as a substitute...
funny how things work out.
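The "signed JSON" idea being hinted at is essentially a compact, JWS-style token the issuer can verify statelessly instead of keeping bearer-token state. A rough HMAC-based sketch (illustrative only, not a complete or interoperable JWS implementation):

```python
import base64
import hashlib
import hmac
import json

# Rough sketch of stateless signed JSON: the verifier needs only the
# key, not a per-token database entry. This is a toy HS256 example,
# not a full JWS implementation.

def b64url(data):
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def sign_claims(claims, key):
    """Produce a compact header.payload.signature token over the claims."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = ("%s.%s" % (header, payload)).encode("ascii")
    sig = hmac.new(key, signing_input, hashlib.sha256).digest()
    return "%s.%s.%s" % (header, payload, b64url(sig))

def verify(token, key):
    """Recompute the HMAC and compare in constant time."""
    header, payload, sig = token.split(".")
    signing_input = ("%s.%s" % (header, payload)).encode("ascii")
    expected = b64url(hmac.new(key, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

The trade-off is exactly the one alluded to above: state moves out of the issuer's storage and into the token, at the cost of needing key management and a revocation story.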

        Cheers Leif
> -- Scott