Securing Cross Site XMLHttpRequest

As I mentioned in my post on Cross Document Messaging, client side cross domain requests are an important area of interest for AJAX developers looking for ways to avoid expensive server side proxying calls. While Cross Document Messaging is useful for allowing third party components or gadgets embedded in a page to communicate using script on both sides, other cross domain scenarios like web services require access to cross domain content using network requests from a client side web application. For example, you may want to use your client side map based mashup to pinpoint Chinese restaurants in your current neighborhood. This could require the mashup to request a text file from Zagat.com with the locations of Zagat rated restaurants in the area, which can then be superimposed on the map.
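
To make the scenario concrete, here is a minimal sketch of the request such a mashup would like to make directly from script (the path and the callback name are purely illustrative). Without one of the cross domain mechanisms discussed below, the browser's same origin policy blocks this kind of request and the page has to fall back to a server side proxy.

```js
// Illustrative only: fetch a restaurant listing from another domain.
// A plain XMLHttpRequest like this is blocked by the same origin policy
// unless the browser and the target server support a cross domain
// mechanism such as XDomainRequest or CS-XHR/Access Control.
var xhr = new XMLHttpRequest();
xhr.open("GET", "http://www.zagat.com/data/restaurants.txt", true); // hypothetical URL
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    plotRestaurantsOnMap(xhr.responseText); // hypothetical mashup helper
  }
};
xhr.send();
```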

Along those lines, a few proposals and implementations exist, like XDomainRequest in IE8, JSONRequest, and the W3C Web Applications Working Group's Cross Site XMLHttpRequest (CS-XHR) draft specification, which combines an access control framework with XMLHttpRequest and other features. While XDomainRequest is focused on enabling anonymous access to third party public data, Cross Site XMLHttpRequest has added functionality and consequently enables a broader set of scenarios that may appeal to the developer who chooses to use cross domain authentication and access control, among other features. As can be expected when securing a large cross section of cross domain scenarios, a number of concerns have been identified with the CS-XHR draft by the web development community, members of the IE team, and members of the Web Apps Working Group. For our recent security feedback on CS-XHR and our take on important security principles in cross domain, please read our Security Whitepaper on Cross Domain. The paper also covers best practices and guidance for developers who choose to build on the current draft if it is supported by a future browser. Note that the issues here are currently being discussed, and some concerns may be mitigated as the draft evolves.
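
At the heart of both models is the requirement that the third party server explicitly opt in before its data is handed to cross domain script. The header syntax in the access control drafts has changed repeatedly, and the IE8 beta uses its own opt-in header for XDomainRequest, so treat the response below as an illustration of the shape of the handshake rather than final syntax: the server names which origins (or `*`) may read the response, and only then does the browser expose the body to the calling page.

```http
HTTP/1.1 200 OK
Content-Type: text/plain
Access-Control-Allow-Origin: *

...body that the cross domain caller is allowed to read...
```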

Meanwhile, your participation in the Web Apps Working Group can add a broader perspective and help raise further issues with the draft so that browser vendors like us can implement it in the future. If you want to help, sign up with the Web Applications Working Group public alias!

For those of you who would like access to cross domain public data, and want it soon, there's XDomainRequest in IE8. We'd love to hear feedback on XDR and from projects that have been built using it. Hit the comments section with links, or just email them to me. I'll be blogging more about this feature in a few weeks!
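
For anyone who wants to experiment in the meantime, a minimal XDomainRequest sketch looks like the following. The URL and callback are illustrative, and the target server still has to opt in with the appropriate response header before the body is exposed to script.

```js
// Minimal XDomainRequest sketch for IE8 (URL and callback are illustrative).
// XDR makes anonymous GET/POST requests only: no cookies, no custom headers.
if (window.XDomainRequest) {
  var xdr = new XDomainRequest();
  xdr.onload = function () {
    // responseText is available only if the server opted in.
    processPublicData(xdr.responseText); // hypothetical callback
  };
  xdr.onerror = function () {
    // Fired when the request fails or the server did not opt in.
  };
  xdr.open("GET", "http://data.example.com/public-feed.txt");
  xdr.send();
}
```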

Sunava Dutta
Program Manager

Comments

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    Microsoft won't implement insecure features. If the "community" wants to standardize on insecure features, IE users will be safe and non-standard.  Sounds good to me.

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    It's insane how much better Firefox is; it's like inexcusable that IE is so terrible for developers... have you seen tools like Firebug? It actually makes web development possible. Please, if there is any love in your heart, model IE after Firefox, or at least make some tools that rock like Firebug.

  • Anonymous
    June 23, 2008
    Why didn't I hear about this before? XDR is pretty much what I need. Any chance this will be changed between now and the next beta? My enterprise, for one, could do without cross site XHR. That just cries 'security nightmare'.

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    Mogden-- The CS-XHR proposal is the one that is scary-dangerous.  Microsoft submitted XDR as a proposed standard; let's see if the other guys can keep their egos in check and put security first!

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 23, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    Embrace, extend, and extinguish. Bill might have left, but it seems like the same s**t to me.

  • Anonymous
    June 24, 2008
    Reading through the whole thread and adjacent ones, it appears that:

  • MS was originally taking an active part in setting up the ACLs; there were 3 MS employees 'dedicated' to working on that issue with the WG. Only one actually did, though he was responsive and actively participated
  • there was a job change: the only MS employee working with the WG got a new job, and forgot to mention what exactly he was doing with the WG
  • whoever took up the slack decided that IE shouldn't bother with participating, and started on a wild tangent based on whatever already existed: enter XDR
  • after developing XDR, MS integrated it into IE8 and, forgetting where they came from and what they forgot to do, decided that their solution was right, everybody else was wrong, and voilà! Instant W3C WG uproar, just add markup.
  • to MS's credit, those who made mistakes apologized for their cavalier attitude - but XDR remains. The sensible solution would thus be: scrap XDR (value it as an interesting experiment) and work with the WG to get a 'perfect' XHR v2 out; after all, wasting 6 months of development at MS doesn't compare with wasting 3 years of work at Mozilla, Apple, Opera, KDE and 2 years at MS. Mitch
  • Anonymous
    June 24, 2008
    XDR by any other name would be Botched? Mitch's investigation, the other comments on this thread, and the subsequent WG discussion seem to point to N.I.H. syndrome.  MS didn't come up with this, therefore they'll go build their own system. If there were serious concerns about CS-XHR, then get them on the table, sort them out, fix them, and move on. I have no intention of writing another wrapper for CS-XHR just to account for IE differences... AGAIN! Please tell me that the IE8 Web Slices and Activities do not "internally" depend on this botched XDR thing... if so, it is time for a quick re-write to get back on track. stan

  • Anonymous
    June 24, 2008
    Does all this negative feedback mean that you guys will reconsider your implementation? Or will you just pretend that you did not get any feedback?

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    @ted - I have read the white paper and W3C mailing list threads in detail.  I don't find the Microsoft objections very compelling.  It reads more like an after-the-fact justification for an earlier decision to go their own way.

  • Anonymous
    June 24, 2008
    "The sensible solution would thus be, scrap XDR (valuate it as an interesting experiment) and work with the WG to get a 'perfect' XHR v2 out; after all, wastin 6 months of development at MS doesn't compare with wasting 3 years of work at Mozilla, Apple, Opera, KDE and 2 years at MS. Mitch" Mitch, as I've mentioned in my draft, Mozilla hasnt even been able to secure XHR in the past. Slapping on cross domain there doesnt really sound wise. Other than that, I really wont attempt to answer your flaming rhetoric. From what I've observed this is classic Mitch and not worth my time. "I should add that you, Mr. Dutta, got a small reprimand from Chris Wilson for putting a bit too much editing on the MS XDR proposal and smackdown on XHR v2." This is hilarious. What's your point?

  • Anonymous
    June 24, 2008
    Mogden, here's what Jonas from Mozilla said about the concerns in my whitepaper, btw, and I do think the reasons are very valid. "Looking forward to continued discussion on these topics. There is definitely some interesting stuff in here so I'm glad we got this feedback!"

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    @fudnation Sounds like lots of MSFT trashing to me. You don't speak on behalf of web devs, so don't try. I'm a web dev and I think XDR is something that should be a standard.

  • Anonymous
    June 24, 2008
    As always a very informative article.

  • Anonymous
    June 24, 2008
    Anyone else having problems posting comments on here?

  • Anonymous
    June 24, 2008
    Why isn't there more granular control over timeouts, like with ServerXMLHTTP (which has four different timeouts controllable via the setTimeouts method)? http://msdn.microsoft.com/en-us/library/ms760403(VS.85).aspx And hopefully the third time is a charm for posting this question... The first two tries from Firefox didn't appear to work.
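
For readers who haven't used the object the commenter mentions: ServerXMLHTTP's four timeouts (resolve, connect, send, receive) are set in milliseconds with a single setTimeouts call made before open. A small JScript sketch, with arbitrary values and an illustrative URL:

```js
// MSXML2.ServerXMLHTTP sketch (server-side JScript); timeout values are arbitrary.
var req = new ActiveXObject("MSXML2.ServerXMLHTTP.6.0");
// resolveTimeout, connectTimeout, sendTimeout, receiveTimeout, in milliseconds
req.setTimeouts(5000, 5000, 15000, 15000);
req.open("GET", "http://example.com/service", false); // illustrative URL
req.send();
```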

  • Anonymous
    June 24, 2008
    @Glen: Why would you want that level of control?   @Mitch: XDR default is "allow none"-- the only safe setting.  If you didn't recognize that, you need to re-read the paper.   XDR looks a lot like JSONRequest from Doug Crockford at Yahoo.  It's a mistake to assume that Microsoft came up with this model in isolation.

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    mike sez <<Embrace, extend and extinguish>> Errr... except they didn't embrace or extend anything, there's no currently implemented competitive feature to "extinguish", and they proposed their work as a new standard that anyone can use.  So it's hard to say this is a case of EEE.

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 24, 2008
    The comment has been removed

  • Anonymous
    June 25, 2008
    The comment has been removed

  • Anonymous
    June 25, 2008
    The comment has been removed

  • Anonymous
    June 25, 2008
    The comment has been removed

  • Anonymous
    June 25, 2008
    Mitch, why put a bunch of branching in one object to make it function differently, when two objects would be a cleaner, more-efficient way of doing it?

  • Anonymous
    June 25, 2008
    The comment has been removed

  • Anonymous
    June 26, 2008
    FWIW, there are more sides to this story: http://annevankesteren.nl/2008/06/microsoft-feedback

  • Anonymous
    June 26, 2008
    Mitch-- Your sniffing claim is entirely bogus.   [1] HTTP content negotiation seemed like a good idea at the time, but it's not really used.  In practice virtually no one builds web services that accept arbitrary data types on a single URL; people create different service URLs to accept different formats.  So the Content-Type header is redundant anyway. [2] There's exactly nothing stopping you from including the content type in the body of the upload if you wanted to.  It's trivial.
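
A sketch of what the commenter describes: carrying the declared type inside the payload rather than in a header. The envelope format, field names, and URL are made up purely for illustration.

```js
// Illustrative only: carry the payload's type inside the body itself,
// since XDomainRequest does not let script set a Content-Type request header.
var envelope = JSON.stringify({
  type: "application/xml",           // declared type travels in the body
  data: "<order><id>42</id></order>"
});
var xdr = new XDomainRequest();
xdr.open("POST", "http://api.example.com/submit"); // hypothetical endpoint
xdr.send(envelope);
// The receiving service parses the envelope and dispatches on "type".
```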

  • Anonymous
    June 26, 2008
    The comment has been removed

  • Anonymous
    June 26, 2008
    The comment has been removed

  • Anonymous
    June 27, 2008
    The comment has been removed

  • Anonymous
    June 27, 2008
    The comment has been removed

  • Anonymous
    June 27, 2008
    @Ted - yes, I understand that Adobe has had to evolve their model as they rolled it out, and it's not ideal to have to patch holes. But no amount of hemming and hawing about the W3C proposal is going to prevent its rollouts (partial or complete) from suffering the same fate... it's the nature of the beast. But my real point was not just that the W3C could use something "similar" to, or in the "spirit" of, what Adobe did (and what Silverlight is now hopefully going to adopt or at least adapt), but that if this model is already rolled out, and has already undergone a fair amount of battle (and is nonetheless a little worse for the wear, with some patches for the wounds it received), then perhaps it's a good place to start, to see if it represents enough (in its current state) for a compromise between the two models. It would be worse for the W3C, Microsoft, and Adobe to all have completely or partially incompatible approaches to the same problem than it would be to have them all start hammering on the same solution. For instance, if the W3C and Microsoft were to say, ok, lots of sites already have cross-domain policy files out there, we'll just make the native XHR object look for those, wouldn't it simplify this whole argument quite a bit? Even if, as we went on, everyone (all 3 players, and all their constituents) pushed and agreed that the approach of the current generation of policy files needs to be beefed up, it's at least a decent starting point, rather than a wholly different approach with custom headers or content types or other more exotic handshake negotiations.

  • Anonymous
    June 27, 2008
    btw, after reading the W3C proposal draft for this XHRv2, I think there are some important differences to point out compared to the "spirit" of what Adobe has already done:

  1. Adobe has support for both file based and header based transport of authorization/ACL info. This makes it more flexible and accessible to developers who author in shared environments where they cannot control the web server response environment. To mitigate the risks that unintended policy injection cause, they have a "policy on policies" which gives final say-so to a root level server policy (file or header) of whether and which custom policy locations are allowed.
  2. By using file based authorization, combined with caching, the need for a "max age" as part of the policy is removed; the onus is instead put on the delivery/response environment (i.e., with custom cache headers), which makes this part of the system more secure and less susceptible to either developer ignorance or malicious intent.
  3. Adobe has an on-demand system for loading policies, rather than the hybrid approach of sometimes inline with the response and sometimes in pre-flight. Basically, with Adobe, it's pretty much always pre-flight, although they do allow the policy to come in via headers PRIOR TO the request in question, as part of the payload of some other response, including for instance the actual SWF about to make the call.  So, the W3C rec could instead say that the policy could (or must) come in as part of the payload of the loading HTML file rather than requiring all the extra (but only sometimes) pre-flight requests and such.  The W3C approach seems needlessly complicated to me -- just my opinion.

Rather than just dismissing my point and lamely saying that Adobe has just had a bunch of holes to patch, I'd love to see someone look at their current (patched) model and show how it doesn't represent a pretty good (and simpler) starting point compared to the W3C or MS approaches. In other words, what attack vectors exist to prove that Adobe's model is insufficient?
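
For reference, the Flash Player policy file being discussed is a small XML document served from the data provider's domain. A minimal, deliberately permissive example looks roughly like this; the site-control element is the "policy on policies" mentioned above, and the wildcard domain value here is illustrative:

```xml
<?xml version="1.0"?>
<!-- /crossdomain.xml at the root of the data provider's server (illustrative) -->
<cross-domain-policy>
  <!-- meta-policy: honor only this master policy file, not custom locations -->
  <site-control permitted-cross-domain-policies="master-only"/>
  <!-- allow content from any requesting domain to read responses -->
  <allow-access-from domain="*"/>
</cross-domain-policy>
```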

  • Anonymous
    June 27, 2008
    The comment has been removed

  • Anonymous
    June 27, 2008
    And, as for the notion that the patched up Adobe model is now perfect, keep in mind that there's already an unpatched bug in the Adobe policy file model, due to their liberal behavior in allowing policy to load from anywhere on the server, and their improper canonicalization of url paths.

  • Anonymous
    June 27, 2008
    "The CS-XHR proposal was so unsafe that it was pulled out of FF3." It was turned off because the spec was changing and Firefox wasn't conforming anymore.

  • Anonymous
    June 28, 2008
    Dao-- exactly.  And guess why the spec kept changing?  You got it: security problems!

  • Anonymous
    June 28, 2008
    @Ted and Dao: yes, it was pulled from Firefox 3 because the spec wasn't final, because it wasn't ready, and wasn't safe enough. However, if Microsoft keeps championing XDR, which isn't perfect either, then we'll have a known insecure object in a browser that everybody will have to work around for a dozen years (see ActiveX for a precedent) - remember, "don't break the web" also means "keep all the outdated cruft and bugs forever". The biggest part of the outrage at the W3C came from the fact that MS stopped communicating and now pushes a solution with visibly no will to amend it, while if they had worked with the rest of the community, we'd currently have a cross domain object (be it CL+XHR or XDR, probably the former) which would be safe (and implemented in Firefox 3.1, Opera 10 and Safari 4). Mitch

  • Anonymous
    June 28, 2008
    Ted: I don't think anyone claimed that the spec was final at some point in the past, so this seems totally natural.

  • Anonymous
    June 28, 2008
    The comment has been removed

  • Anonymous
    June 28, 2008
    The comment has been removed

  • Anonymous
    June 29, 2008
    @Ted: what Dao said. I'll add that XDR relies on content sniffing. Any and all cases of content sniffing have been abused one way or another. The plain dumb system of control lists, much less so. The biggest problem with CL+XHR is setting up what must be checked, and how. This is where having several eyes and sample implementations (even on a different tack, like XDR) is interesting. But there's a huge gap between a test type and a final version. Mitch

  • Anonymous
    June 29, 2008
    Oh, Mitch... must we continue? <<<I'll add that XDR relies on content sniffing>>> ...and I'll add that you don't know what you're talking about.  XDR doesn't "rely" on content sniffing in any way.  You shouldn't try to talk about things you don't understand. <<< The plain dumb system of control lists, much less so >>> I won't even bother asking for your data, since I know you haven't got any.

  • Anonymous
    June 29, 2008
    The comment has been removed

  • Anonymous
    June 29, 2008
    The comment has been removed

  • Anonymous
    June 29, 2008
    The comment has been removed

  • Anonymous
    June 30, 2008
    The comment has been removed

  • Anonymous
    June 30, 2008
    In an earlier comment, Sunava referenced an OpenAjax Alliance discussion from March 2008 where we talked about W3C Access Control versus XDR, so I want to clarify. (I manage things at OpenAjax Alliance.) At the time, the consensus among the OpenAjax folks was that XDR provided a more secure basis than Access Control. Most of us liked XDR's conservatism, such as not transmitting cookies with cross-site requests. However, some months have passed now, the Access Control effort has been evolving, and all of the email discussion has helped to educate everyone involved. At this point, if asked again, I'm not sure what the OpenAjax members would say on the subject. Probably everyone would like to see what Access Control looks like after the W3C emerges from this week's meetings and then take time to study the security implications. Clearly the best case is if there is a single standard for cross-site requests from the W3C, and that standard addresses security concerns appropriately.
