Cross-Site HTTP Requests Now Supported in Firefox (developer.mozilla.org)
65 points by shmichael on Jan 20, 2010 | 18 comments



- Looking forward to using these new HTML-5ish features: storage, ___location, workers, accelerometer... It's going to be like web 2.0 all over again.

- The Mozilla Developer site has a lot of good documentation and examples: https://developer.mozilla.org/en/Firefox_3.6_for_developers


It's an interesting spec but it requires server-side changes.

I wish browsers would implement an 'application mode' which would require some kind of installation step and then open up more permissions to code from that ___domain.

Even if you get users to install Prism, you can't make cross-___domain requests, which seems crazy: I could just as easily get you to install some other binary and have full control of your machine. (AIR does allow cross-___domain requests.)


Careful; that sort of wish led to ActiveX and the disaster that is Korean online banking.


The problem with ActiveX is that it was more promiscuous with its permissions than a first-year college student on free-drink night :-)

Opening up some additional permissions (using something like the Android permissions model) would be a touch more restrained.


Except you can't, since none of that works in IE.


Well, yes. But as FF+Chrome+Safari have more than a third of the market share, you can start writing parts of a website that work better with HTML5. For example, an application can use the application cache to be usable offline, and tell IE users they could have an offline experience by changing browsers. That way everything still works, but people with HTML5 browsers get a better experience.
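For reference, the application cache mentioned above works by pointing a page at a manifest file via `<html manifest="offline.manifest">`. A minimal sketch (the filenames are illustrative, not from any particular site):

```
CACHE MANIFEST
# offline.manifest -- files the browser should store for offline use
index.html
app.js
style.css

NETWORK:
# everything else requires a connection
*
```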


Sure you can.

Depends on your target audience. IE has vanishingly small market share among several groups, such as tech-savvy non-MS developers, Apple fanboys, etc.

Also, if those capabilities lead to features that vastly surpass your competition's, it'll be easier to own a majority of non-IE customers than it would be to go after all customers with generic features.

You don't need every person with a web connection as a customer, you just need enough to be profitable/successful.


But if you cater to poor corporate suckers like me, IE still has the vast majority of traffic. It's hard to justify spending any time on features they won't see.

In some industries, IE6 is still the plurality.


What industry would that be, time travel? :)

Here in the present, IE6 is around 2-3% of visitors.


As scary as this may be:

The financial services industry. I work on a large retail trading site - many of our customers and a majority of our clients still visit our site in IE6. They have no choice, and some of them do not even realize they have no choice.


My data shows that IE6 is about 20% of the traffic in the healthcare, biotech, and telecom industries (among others).

On the bright side, government & gov't contractors are only 10% IE6.


The UK defense industry is still standardized on IE6.


You do have a point. But if you're writing web stuff for the intranet or if you otherwise have enough control over your users to control their browser choice then it's a much better use of your resources to code for the future (FF, Safari, Chrome, Opera) than for the past. Big stupid corporations who still insist on writing all internal stuff to work with IE6 are, sadly, still around. But declining.



whoah, whoah.... I have really wanted to use cross-site http requests, but I'm having some trouble understanding certain decisions here.

Why are they using a new set of HTTP headers to describe scenarios that are already covered by HTTP response codes? Why does the client send an Origin header at all in the first place, when it can be inferred from the Referer? Why does the server respond with a list of allowed origins, when it could simply send an HTTP response code to say allowed/not allowed/auth required/etc.?

I'm probably missing something, but this just doesn't add up...

EDIT:

Oh - maybe because there isn't a good javascript interface to HTTP response codes? Well, it sounds like a client-side solution would be to build this interface, rather than making the server support some weird headers that will still rely on the client to faithfully perform access control.
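One piece of the answer: enforcement in CORS happens in the browser, not on the server. The response still reaches the client either way; the header just tells the browser whether the page may read it, which a status code alone couldn't express without also changing what the server returns. A rough sketch of the server-side half (the function name and whitelist are mine, not from the spec):

```javascript
// Sketch only: how a server might decide what CORS header to attach.
// The browser compares the header against the requesting page's origin
// and refuses to expose the response body if they don't match.
function corsHeaderFor(requestOrigin, whitelist) {
  if (whitelist.indexOf(requestOrigin) !== -1) {
    // Echo back the single origin being allowed for this request.
    return { 'Access-Control-Allow-Origin': requestOrigin };
  }
  // No header at all: the browser blocks the cross-site read.
  return {};
}
```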


I was just playing with this the other day. One interesting thing this now lets you do is cross-site basic HTTP authentication, if the server is configured to accept the new headers.

I'm talking to YOU, Twitter...
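For the curious, HTTP Basic auth is just a base64-encoded "user:pass" in an Authorization header, which a cross-site XHR could now carry if the remote server's Access-Control headers permit it. A sketch (the helper name is mine; shown with Node's Buffer for the encoding, browsers would use btoa):

```javascript
// Illustrative helper: build the value for an Authorization header,
// e.g. xhr.setRequestHeader('Authorization', basicAuthHeader(u, p)).
function basicAuthHeader(user, pass) {
  // Basic auth = the literal string "user:pass", base64-encoded.
  return 'Basic ' + Buffer.from(user + ':' + pass).toString('base64');
}
```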


This is the problem I see with cross-___domain requests: it puts the implementation change on the service provider rather than the app provider. If I were building an application using PHP, I wouldn't need to get Twitter to change their server to allow me to call their APIs, so why is this different?


That's less of an issue for intentional APIs (where the API provider is already doing work for you) than for unintentional APIs (RSS feeds, any reasonably clean page you can crawl with jQuery), where the provider will do nothing to help you.





