
An interesting solution I've heard[1] to reform copyright (and possibly patent) law is to require compounding annual fees to maintain them. For example, if someone writes a new song or book, we could have a short "free" copyright period. After that, the creator (or more accurately, the rights holder) would need to pay an annual fee to maintain the copyright. The fee would increase by some percentage each year.

If the value of the work is greater than the fee, then it'd be in the interests of the rights holder to pay the fee and maintain the copyright, for example to be able to license it out to others. If it's not, then the copyright would expire and the work would enter the public ___domain.

A high enough interest rate for the compounding would ensure that everything is eventually in the public ___domain. For works that are repeatedly extended, say Mickey Mouse, the public will receive the fees collected for each of the extensions until it enters the public ___domain.

This also has the interesting effect of incentivizing the creation of new works as there wouldn't be any fees associated with them for the initial term.

There are obviously a lot more details to handle, such as having a central registry, inflation adjustments, differentiating between new works and derived works, and how to handle works created prior to the introduction of a system like this. Still, I think it's an interesting proposal.

[1]: I don't remember where exactly but probably here on HN.
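As a back-of-the-envelope sketch (all the numbers here are made up), the scheme reduces to a few lines of code:

```python
def compounding_copyright(annual_value: float, base_fee: float,
                          growth_rate: float, free_years: int = 14):
    """Simulate the proposed scheme: a free initial term, then a renewal
    fee that compounds each year. A rational rights holder renews only
    while the work's annual value exceeds the fee. Returns
    (year the work enters the public ___domain, total fees collected)."""
    fee, total, year = base_fee, 0.0, free_years
    while fee <= annual_value:  # worth renewing this year
        total += fee
        year += 1
        fee *= 1 + growth_rate
    return year, total

# e.g. a work worth $10,000/year, a $100 starting fee, 25% annual growth:
years, fees = compounding_copyright(10_000, 100, 0.25)
```

With these numbers the work stays under copyright for 35 years and pays roughly $43,000 into the public purse on the way out; raising the growth rate pulls the expiry in sharply, which is exactly the "everything is eventually public ___domain" property.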


I liked this concept, but you need to establish what the "unit" of the work is. If I'm a photographer, do I need to pay the compounding fee on each picture I take? I take thousands a year, while George Lucas makes a single motion picture over a year.


The idea that something will enter the public ___domain once its market price has been established is nice, but my problem with this scheme is that it needs so much fine-tuning. How should the prices increase? Exponentially, adjusted for inflation? Will the prices be much too low for much too long, or will they rocket up at unreasonable rates for some kinds of media?

I think the best scheme for works entering the public ___domain is modelled on eminent ___domain. That is, a work may be "seized" (and its owner duly compensated) by a government when the utility to the public of it entering the public ___domain is great enough.

Obviously there would need to be a lot of structure around it, because it's going to happen all the time. There might have to be a registry of copyright owners. There might also be taxes or regular charges levied as in your system.


> Will the prices be much too low for much too long, or will they rocket up at unreasonable rates for some kinds of media?

It would actually make sense to have both; it would essentially soft-cap copyright durations.


That was Rick Falkvinge. It was probably either on his blog or posted to TorrentFreak.


Sounds like a good idea


> $500/month for a service like this?

$500/mo sounds like peanuts for what a service like this provides. For a business that would use this type of information (e.g. TV shows like Daily Show or Colbert or a political campaign) it's both worth more and cheaper than doing it in house. The power bill alone for all those TV-tuners/DVRs would be more than that!

I'm surprised that it's not more expensive.


I'm fairly certain this is what The Daily Show uses:

http://www.snapstream.com/

... or at least something similar. In other words, they do it themselves. That's the only way I can explain their uncanny ability to find a dozen clips containing a certain word or phrase across very long time frames.


A tangent, but I thought I remembered The Colbert Report staff developing something like this in-house. After a bit of searching, it turns out that staff members had actually developed and spun-off a segment writing and producing tool [1].

[1] http://www.businessweek.com/articles/2014-06-17/stephen-colb...


Assembly is a piano wire, you're gonna get your hands dirty and nobody will hear about it.


> Chattanooga has the largest high-speed internet service in the US, offering customers access to speeds of 1 gigabit per second – about 50 times faster than the US average. The service, provided by municipally owned EPB, has sparked a tech boom in the city and attracted international attention. EPB is now petitioning the FCC to expand its territory. Comcast and others have previously sued unsuccessfully to stop EPB’s fibre optic roll out.

I did a quick search to see pricing for EPB's internet service and wow is it cheap[1]. The two plans listed on the site are 100 Mbps for $57.99/mo or 1 Gbps for $69.99/mo. Oh, and both plans are symmetric, so you get those speeds for upload as well.

I don't know of any city where the local cable co offers anything like that, let alone at those price points. No wonder they want to block this through legislation; competing in the market would not be pleasant for them.

[1]: https://epbfi.com/internet/


No monthly data caps either according to these reviews http://www.dslreports.com/comments/3492


Bandwidth is insanely cheap compared to last-mile infrastructure. Bandwidth caps are basically just a way to soak the customers who use the service more and are, hence, willing to pay more.


Well, if you have a lot of contention in your last-mile infrastructure, bandwidth caps make sense. This applies to shared infrastructure like a coax cable network, or the connection up to a DSLAM.

Fiber nicely solves that though.


Why would the connection up to a DSLAM have more contention than a fiber-to-the-home network? Shouldn't most DSLAMs be using fiber already?


Yeah, there's no contention AT ALL up to the DSLAM. DSLAM is short for Digital Subscriber Line Access Multiplexer. Every DSL customer has their own dedicated copper pair from the DSLAM to their house: it's their phone line. There may be congestion between the Central Office where the DSLAM lives and the rest of the internet, but in practice that rarely happens. The big limit on DSL is getting reasonable speeds a mile down the line over ancient copper pairs that were never intended for the purpose. Don't people remember the old Cable vs. DSL commercials, where the DSL companies showed cable customers furiously calling each other bandwidth hogs and telling them "log off!"?


DSL seems to somewhat taper off at 100 MBit/s over here in Germany, but that's actually quite reasonable speed for non-business customers, with 1080p taking 27 MBit/s.

I once had cable in a crowded neighbourhood, it was pretty terrible at peak times.


Isn't the speed of DSL highly dependent on the details of the actual analog line?

I have an ADSL line, and my contract says "50Mbps", but I rarely actually get over about 4Mbps. As far as I can figure, the reason is because my house is near the limit of distance from the central office... (the phone company actually has a calculator web app thing that lets you plug in your address and get the line length...and then a separate calculator that maps from line-length to expected speed)

[I'm not complaining or anything.... It's a cheap line which I've had for ages, they're very up-front about the caveats of ADSL, and I could easily get fiber or cable (from multiple providers) for a bit more money. So far I just haven't cared enough to change it.]
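For what it's worth, those phone-company calculators boil down to a distance-to-speed curve; a toy version (the breakpoint numbers here are invented, not any carrier's actual figures) looks like this:

```python
def adsl_speed_estimate_mbps(loop_km: float) -> float:
    """Rough ADSL2+ downstream estimate by copper loop length.
    The breakpoints are illustrative only; real sync rates depend on
    wire gauge, noise, and crosstalk."""
    table = [(0.0, 24.0), (1.0, 18.0), (2.0, 12.0),
             (3.0, 6.0), (4.0, 2.0), (5.0, 1.0)]
    if loop_km >= table[-1][0]:
        return table[-1][1]  # past the last breakpoint: floor speed
    for (d0, s0), (d1, s1) in zip(table, table[1:]):
        if d0 <= loop_km <= d1:
            # linear interpolation between the two nearest breakpoints
            return s0 + (s1 - s0) * (loop_km - d0) / (d1 - d0)
```

The shape is the important part: a "50Mbps" contract near the distance limit sliding down to single-digit speeds is entirely expected behavior, not a fault.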


Cable could easily offer these rates by using the bandwidth currently occupied by their tv channels. If comcast had a unified digital backbone it wouldn't be a problem. It is only because they have monopoly power. I find it completely distasteful the number of laws that prevent other entities from passing laws in certain domains. Washington state has a law that prevents cities from legislating any form of rent control.

We need to clean out these bullshit laws.


> I find it completely distasteful the number of laws that prevent other entities from passing laws in certain domains. Washington state has a law that prevents cities from legislating any form of rent control.

Are you serious? Rent control laws are awful, destroy the incentive to maintain housing stock or build new housing stock, and ultimately raise rents for everyone except the lucky few who can get rent controlled apartments.

Municipalities are just organs of the state, and state legislatures are entirely justified in preventing them from enacting myopic legislation.


I'm with you on rent control, but states banning municipalities from owning their own broadband networks is at least as myopic. The argument is that publicly owned utilities crowd out competition, but we have abundant evidence for the argument that (unlike typical residential rent markets) utility markets are dominated by oligopolies who are easily able to engage in anti-competitive behaviors without having to form a cartel. Sure, in theory Comcast and AT&T might offer such competitive discounts/service upgrades that they cross some threshold and trigger an all-out price war that would be good for consumers, but I'd put the odds of that actually happening at close to zero.

Now I'm not such a big fan of municipal legislation - as a European, I think American democracy may be too decentralized and exacerbate the public choice problem, but I don't see a clear organizing principle that will only prevent economically bad legislation while allowing economically good legislation. The proposals to ban municipal or county internet services in North Carolina are just as myopic as rent control laws in many Californian cities.


> Now I'm not such a big fan of municipal legislation - as a European, I think American democracy may be too decentralized and exacerbate the public choice problem, but I don't see a clear organizing principle that will only prevent economically bad legislation while allowing economically good legislation.

Operating utilities (especially last mile internet) at the municipal level makes much better sense than federalizing it. The transit market is sufficiently competitive that there is no need for the last mile operator to own a national network. Meanwhile with a national network operator, if the service is unusually terrible in a specific city there is nothing that city's residents can do about it because even when they're all in agreement they don't have enough votes to move the needle at the federal level.

And then you can have laboratories of democracy: If one city funds their network entirely through subscription fees and another funds it entirely through property taxes then maybe we discover that one works much better and other cities can start doing that.


Honestly I think you're way off base on rent control. I'd love to know what experiences you've had with rent control that have led you to such a negative POV? Perhaps living in Philly, as your profile mentions, has blunted you to some of the benefits of RC in housing-crunched cities.

The biggest fallacy I hear from people who don't live in one of these housing-crunched cities is the line.. everyone except the lucky few who can get rent controlled apartments. Rent control laws in SF, for example, are very inclusive. I mean, RC apartments can't be both rare AND pervasive enough to "destroy the incentive" to invest in housing, right? And every RC law I know of (which is only a few) has exemptions for new construction. In SF it's anything built after ~1980. In NYC I know it's back much further than that. So it doesn't do anything specifically to depress new construction.

Fundamentally, the question is do tenants have rights or are they at the mercy of the property rights of the building owner? In SF we believe tenants DO have rights. And you know what? RE investors here, where a rent controlled 2br apartment will set you back at least $3k a month in the city, are doing quite alright. They don't seem too desperately in need of expanded protections under the law.


    everyone except the lucky few who can get
    rent controlled apartments
No, that's not the problem. As you say, rent control can't be both pervasive and rare at the same time. The real problems with rent control are:

* It's hard to move. Say you change jobs and now work in a different part of the city, or your kids move out, or you'd like to move in with a partner. Normally one of the benefits of renting is that you can move, but with rent control that means switching to a place that's much more expensive.

* Landlords benefit from having their long-time tenants leave. Without rent control landlords love long term tenants because they're reliable and they mean less work finding people for the apartment. Rent control reverses this, and the landlord loses the incentive to upgrade the apartment and otherwise keep the tenant happy. Yes, the landlord is being a jerk when they do it, but sometimes the best fix for widespread jerk-ness is changing the incentives.

* It keeps out outsiders. People who want to live in your city enough that they're willing to give up their existing local connections and start over in your city are really valuable, and rent control means they pay a lot more than people who've lived in the city longer.

* It depresses new construction. Yes, I know new buildings aren't subject to rent control, but in a place where everything is already built out you can't put up a new building without taking down an old one. Rent control means you either have legal restrictions saying you can't replace buildings, or you have protests against people doing this and displacing existing tenants. But in a growing city, without new construction housing is going to get more and more expensive.

* Rent control puts off dealing with the problem of housing costs. If everyone in SF rented, and everyone had to pay market rates, then there would be the political backing for changes that would make housing more affordable. Rent control means that the long-time community members who would be best at this political change don't really feel much urgency because they have nice cheap places with rents set a decade or two ago. (But to be fair, homeownership is also a problem here, because rents and property values move together and homeowners want their property to become more valuable.)

    RE investors here, where a rent controlled 2br apartment
    will set you back at least $3k a month in the city, are
    doing quite alright. They don't seem too desperately in
    need of expanded protections under the law.
Rent control hurts renters. I don't care about real estate investors. That $3k isn't why we need rent control, it's because we have rent control.


  Rent control reverses this, and the landlord loses the
  incentive to upgrade the apartment and otherwise keep the
  tenant happy.
I'll stop you right there; while that may be what happens in theory, it certainly hasn't seemed that way to me in practice.

Perhaps in a market where there is sufficient supply for demand, but in a market where demand is sufficiently high, many property owners don't do more than what is legally required and keep raising their rents.

You would not believe some of the dumps I've been shown by smiling realtors or owners that wanted thousands of dollars per month -- I'm talking about $2500+ a month for a two-bedroom apartment with boarded-over holes in windows, holes in walls, a terrible smell, and carpet that looks like it was last replaced decades ago.

Time after time, the pattern I've seen from experience in the Bay Area has been that owners will do whatever the market will bear -- in a market with severely constrained supply, the market is forced to bear quite a bit.


    If everyone in SF rented, and everyone had to pay market rates, then there 
    would be the political backing for changes that would make housing more 
    affordable.
This is demonstrably not true in London, where rent just keeps going up and hardly any new housing is being built. For example, an MP has just resigned his £130k+ job (total remuneration; that's about $215,000) because he can't afford to rent a 4-bedroom house anywhere near Westminster [1].

Sometimes the state just needs to step in and actually help people (by directly building high quality housing and selling it to people below market value). Currently if you buy a new build in London it is priced so high that you can expect it to be under water for a few years (the builders provide special mortgages because banks refuse to do them directly in many cases - the flats are not worth what they are being sold for).

[1] http://www.theguardian.com/politics/2014/aug/11/tory-foreign...


    This is demonstrably not true in London where rent
    just keeps on going up and hardly any new housing
    is being built.
http://www.london.gov.uk/priorities/housing-land/renting-hom... claims only a quarter of people in London rent? So the "if everyone rented" bit isn't being met. Because property values and rents mostly move together, as long as most people own their homes you have more people backing policies that push them higher than lower.

I don't know the London political situation very well: what makes it so expensive to build new units? Are there strict historical preservation laws? Rules that let neighbors veto or slow nearby construction? Permits and taxes?

    He can't afford to rent a 4 bedroom house
    anywhere near Westminster
"Anywhere near Westminster" apparently means "within walking distance". A 30min public transit commute covers areas where renting a 4br would be ~£26/year.

    you can expect it to be under water for a few years
I'm not sure what you mean by this. You're saying if you build a flat and sell it then people are buying them at prices at which they wouldn't be able to resell them? Is this just because there's a premium for new construction? (Cars in the US, for example, lose ~20% of their value when they're bought for the first time because they stop being "new". So you could say that if you buy a new car you expect to be "under water" at first, but no one would say that.)


You've clearly given this a lot of thought, but we are at odds with one another. I hate point-by-point refutation, but just a few thoughts I have...

All of your points are predicated on the notion that rents would decline if you eliminated RC. I think that's... hopeful. I mean, in some places you seem to suggest that landlords are beset by low rents (not improving the place because of it, hoping their longstanding renters move out). So certainly all of those landlords would raise rents. And non-RC units will, what? Reduce their prices to equalize the market? I doubt that. And now you have higher rents all over the city. And that's good for people? Because it might spur more development? But development is slow and favors up-market construction.

And blaming lack of development on the tenant-displacement laws that are part of rent control? And suggesting that rent control is the problem is a farce. If it's these laws that are the problem, why not write a post assailing THEM? And rent control laws are the reason people protest these developments? Really?

Being hard to move... ok... so go tell the family that you're raising their rent so that they'll get used to paying market rate. That way they'll be able to move anywhere they want! What a great deal for them!

I know for some reason you think rents will go down. That developers will build new buildings at high expense in this seismic-zone and then undercut the existing market (of old buildings with tiny rooms and few modern amenities), ignoring the fact that if they build up-market they can write leases at $50/sqft and have no trouble renting out the entire building well before it's finished.

Finally, if you actually read rent control laws, you'll see that they are not these cartoonish works of absurd leftist ideology, they do make carve-outs and exceptions for capital improvements and other large expenses to be passed-thru as bigger rent increases to the tenants. They are surely not perfect but property owners are a very powerful bloc and their voices are heard in the law IMO.


Yep, totally serious. Cite your claims if you have them. Municipalities have plenty of competing laws to dis/incentivize housing stock. We have buildings that haven't been updated in 30-50 years raising the rent of a studio apartment from $850 to $1,250 in the span of a month.

Libertarian nerds aren't the only people in society.


> Cable could easily offer these rates by using the bandwidth currently occupied by their tv channels.

Not exactly. It's true that a coax cable can, over distances of hundreds of metres, carry up to a frequency of 850 MHz at fairly low attenuation and, at greater attenuation and/or under ideal circumstances, up to 1.2 GHz in a useful capacity. This may translate, as per Shannon-Hartley and QAM, into a raw bandwidth of multiple Gbps.
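That arithmetic is easy to sanity-check. A minimal sketch of the Shannon-Hartley bound, assuming a 6 MHz channel width and a ~40 dB SNR (both just illustrative figures for a clean plant):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + S/N)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# one 6 MHz cable channel at an assumed 40 dB SNR: roughly 80 Mbps raw
per_channel = shannon_capacity_bps(6e6, 40)
# a hundred such channels of usable spectrum: multiple Gbps, as claimed
total = 100 * per_channel
```

Real DOCSIS QAM constellations sit below that theoretical ceiling, but the order of magnitude holds.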

However, the topology of most FTTLA deployments doesn't imply that any individual customer is guaranteed this speed. In many deployments the LA - last amplifier - will be shared among hundreds of customers. In some cases, it may be only dozens. In some, it may be in the low thousands.

While the TV multiplexes are always going to take up a huge amount of spectrum (depending on the deployment, easily more than the majority of it), it's also becoming more common, for example, to dynamically switch multiplexes onto the spectrum on demand only.

If you have a healthy contention ratio, quality coax, few people on your LA, it's a good time of day, and you have a sensible provider, then yes, it's possible to get a Gbps over coax. But the stars need to align for that magical moment, whereas with GPON or especially 10G-PON it's basically a guaranteed thing.

Comcast isn't holding back Gbps speeds because they have monopoly power and want to artificially constrain total bandwidth (well, they kind of are, but in a different way), but because rolling them out would imply rebuilding their FTTLA deployments deep into the field, basically taking them halfway to FTTP/FTTH.

The primary reason they're not offering Gbps isn't that they're a monopoly that hates selling bandwidth; it's that they're a monopoly that hates investing money in infrastructure. After that, it's definitely the hating-bandwidth thing.

If it were the former, they'd be doing a far better job competing against municipal broadband deployments and Google Fiber than they currently are.


> hate selling bandwidth first

So we pretty much agree then. They might not get to 1 Gbps, but the 50 Mbps I currently get (latency is horrible) is derived from a handful of channels. There is at least 600 MHz more bandwidth sitting there. They could uncap the local network and allow other providers to tap into their last mile for a fee.


What? No.

Comcast has their network reliably built out to 750 MHz. If you think there are 600 MHz unused on that, you're left with 150 MHz. Which is about enough space for about 20 NTSC channels and the FM band. I can assure you that there's basically no HFC/CATV provider in the world, never mind the US, with 600 MHz unused just sitting there.


I use the term "sitting there" very loosely, meaning TV channels that no one in my neighborhood is watching. From my completely unscientific poll of the people on my block that I interact with (selection bias), most do not use the channels; they use Hulu, Netflix, Amazon Prime, torrents, etc. The only person I know who actually uses a TV does so with terrestrial signals, but she is 5 std dev away from normal.


It's cheaper to lobby the government to prevent competition and maintain the status quo than it is to upgrade your networks.


"I don't know of any city where the local cable co offers anything like that"

Any internet provider in Moscow. It's actually even cheaper.. US internet and phone service prices are ridiculously high.


As a Canadian: It is unfortunately not just the US where internet service prices are ridiculously high, but all of North America.


I agree it's cheap by US standards. My 100 Mbps (symmetric, fiber) is $42. It's in Sweden though.


I don't suppose you have a few servers laying around, do you? :)


They do. In fact, there is a lot of fiber infrastructure built in Germany, and businesses have taken advantage of it, e.g. Hetzner.


We ran our development server cluster from my basement on that type of 100 Mbps for a few years. It works.


I have something for you then! http://stackmonkey.com/appliance/new


My ISP in Seattle is similarly equipped/priced and it's awesome. No contract, no data caps, no problems. Support is great too: a real person who knows their stuff answers instead of a machine. I got set up in about 2 minutes from hello to loading my first web page.

http://www.condointernet.net/


Including typos in the spam messages falls in this same category. If seeing typos in an "official communication" triggers your alarm bells then you probably would not fall for whatever scamola it's a part of. It'd be in their interests to get you to drop off early.


I imagine there still needs to be some balance though. There's likely a set of people who may fall for the scam even if it's ridiculous, but if made slightly more ridiculous they suddenly would become more suspicious.


This reminds me of a lot of malware like fake AVs and ransomware - very poor spelling and grammar throughout (E.g. "You Computer Is Infected!!!" comes up often.) Although in that case, it might actually be the extent of their English skills since most of this tends to come from non-native-English countries like the far East.


Why is it in their best interest for you to drop off early?

You probably won't fall for it, but there is still a chance...


Opportunity cost. If you can devote a few hours a day to each of, say, three gullible marks, you have a much greater chance of a payout than devoting a minute to each of 500 random marks.


Wait, what? These messages are sent en masse, and they aren't really hard to write up.


That's referring to the time that the scammer needs if the recipient falls for their bait and initiates contact.

You only want the truly gullible to send that first email, or it would be a waste of time for the scammer to talk to all the people who wouldn't wire transfer their money a few days/weeks later.


Right, all contact after the first email has to be tailored to their responses. Even gullible marks usually need hours or days (at minimum) of building rapport before they're actually comfortable enough to be conned into executing a transaction. If conning people into directly handing you cash were automatable like phishing, you'd see a lot more con artists and a lot less of other crimes.


After reading your reply the whole paper made sense to me, thank you.


The initial contact is 'en masse', but the followups are all by hand. Time spent by the scammer to respond to potential marks is, in fact, a scarce resource.


Sending out the initial spams is very automated (and so cheap), but if you respond, they probably have a human handling them (maybe with templates, but still under human control), which isn't nearly as cheap, so they want to avoid wasting time on insufficiently gullible responders.


This is precisely the point of the linked paper. Maximizing people who are initially attracted to the scam is NOT the best strategy for scammers, because most will likely be rejected at a later point, when it's costlier to the scammers.

The best strategy for scammers is to reject everyone but the most gullible targets as early as possible. Obvious typos would be suitable for this.
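A toy expected-value model (every number here is invented) shows why: the human follow-up time dominates the cost, so a pitch that filters harder can beat one that attracts more replies:

```python
def expected_profit(recipients: int, p_reply: float, p_pay: float,
                    payout: float, cost_per_reply: float) -> float:
    """Profit = replies * (conversion * payout - handling cost per reply).
    Sending the initial spam is treated as free; each human-handled
    reply costs the scammer time."""
    replies = recipients * p_reply
    return replies * (p_pay * payout - cost_per_reply)

# polished pitch: many replies, but mostly skeptics who never pay
polished = expected_profit(1_000_000, 0.01, 0.005, 1_000, 50)
# obviously-flawed pitch: far fewer replies, but only the gullible reply
obvious = expected_profit(1_000_000, 0.001, 0.10, 1_000, 50)
```

With these made-up figures the polished pitch loses money on handling costs while the "obvious" one profits, which is the paper's point in miniature.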


It's like how they say certain intro classes for majors are made difficult to weed out students who would eventually drop out if they reached the harder classes. The logic there is that students aren't wasting their time taking classes they won't use when they switch majors, and the classes won't be filled up before the students who will make it all the way can enroll.

This way the scammer isn't doing a back-and-forth over 2-3 emails with those who would eventually realize it's a scam. They immediately weed them out so they're spending time on those who will pay off.


> Heroku needs to open source the testing it has used to claim 3X performance.

The bottom of the bar chart says: Database performance was measured using pg bench with 150 concurrent clients performing read-only transactions.

pgbench[1] is a performance benchmarking tool that is part of PostgreSQL itself. It ships alongside the rest of the PostgreSQL binaries and is open source just like the rest of PostgreSQL.
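For reference, a read-only run like the one described under the chart would look roughly like this; the scale factor and duration are my guesses, since Heroku hasn't published theirs:

```shell
# initialize a test database with sample data (scale 100 is ~1.5 GB)
pgbench -i -s 100 mydb

# read-only (-S, SELECT-only) benchmark: 150 concurrent clients,
# 4 worker threads, 60 seconds; reports transactions per second
pgbench -S -c 150 -j 4 -T 60 mydb
```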

I think this line from the article is interesting:

> In addition to Heroku Postgres DbX we are launching new database plans with double the memory and speed improvements of up to 3x at the same price as our current plan lineup. These plans feature an upgraded and re-engineered infrastructure to drastically improve their memory and speed.

I wonder if the bulk of the price/performance gains are from switching to newer instance types on AWS and/or switching the underlying storage from magnetic disks to the new SSD EBS volumes[2].

[1]: http://www.postgresql.org/docs/devel/static/pgbench.html

[2]: http://aws.amazon.com/about-aws/whats-new/2014/06/16/introdu...


> This is all well and good, but why does everyone need to have their own damn app?

I can't comment on this new app yet but I'm a near daily user of the regular NPR app (which I think is officially called "NPR news"). It's way better than any generic radio app I've tried.

Besides listening to the individual local stations, you can also get the latest news (the ~4-minute hourly blurb), create playlists of entire shows (without the interludes, i.e. just the content), and read built-in news articles. I've even used the local NPR station finder to help me tune an analog radio in a new city where I'm not familiar with the local stations.

It'd take quite a bit of agreement on content licensing and data formats to have anything like that be as usable as the NPR app across multiple content providers.

> Seriously - why does everyone assume that if I want to listen to e.g. NPR, I only want to listen to NPR?

Well some of us do :D


I agree with your sentiment when it comes to e-commerce and print publishing, but for radio, I'm already a big app-consumer.

Why? So I can escape the soul-crushing drudgery of Car Talk and Prairie Home Companion. They're good shows, but they're essentially the same show, repeated thousands of times. And no matter where I am, they seem to be what's on the local NPR station.

So I already have apps (or streams in the Radium app) for the CBC, BBC, Deutsche Welle, etc. If this new NPR app lets me listen without needing to constantly fiddle with my phone and select stories, I'll use it a lot.


> So if your Twitter password happens to be the same as your ATT password, you're out of luck.

Why would you have both passwords be the same? That makes no sense. All passwords should be different.

> I only use two factor authentication if I can add it to my Authenticator app and save the code/QR code somewhere offline. Everything else is just too complex to be secure.

TOTP based two-factor auth (e.g. Google Authenticator) is my preferred method as well though I'll still set up an alternative method if it's not available. For example Namecheap offers 2FA via SMS. While not preferred, it's better than nothing.
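As an aside, the TOTP scheme those authenticator apps implement (RFC 6238, built on RFC 4226's HOTP) is small enough to sketch with just the Python standard library:

```python
import hashlib
import hmac
import struct
import time
from typing import Optional

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the counter, dynamically truncated."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, at_time: Optional[float] = None,
         step: int = 30, digits: int = 6) -> str:
    """RFC 6238: HOTP with the counter taken from the current 30s window."""
    t = time.time() if at_time is None else at_time
    return hotp(key, int(t // step), digits)
```

Both sides derive the same code from the shared secret and the clock, which is why it works with no network connection; a real app would first base32-decode the secret embedded in the QR code.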


> Why would you have both passwords be the same? That makes no sense. All passwords should be different.

I'd put good money on the percentage of AT&T customers who use the same password for all their web services being sizable enough to make that a valid concern.


>Why would you have both passwords be the same? That makes no sense.

Sure it does. With only one password you are less likely to forget how to log in to an account. It makes perfect sense.

>All passwords should be different.

And eat veggies with every meal. And properly hydrate throughout the day. And get 40 minutes of moderate cardio at least three times a week. And call your mother more. And floss daily.

You can't expect all the people to follow the "best" advice all the time.


Does anyone know of a city that has an open API for accessing whether or not there are cars parked at its metered spots[1]?

For cities that haven't switched to shared muni-meters[2], the existing parking meters seem like the natural place to add hardware that checks for availability. If the meters are network connected then at the very least they could report back whether or not they are in use or "expired". A lot of the newer meters allow for credit card payments so I'm assuming they have some kind of network connectivity.

It'd be awesome if a city provided this data to the public. Then anybody could just make an app/website[3] that shows the closest place you can park. Forget trying to make a buck off of reselling your existing spot, the savings on traffic, pollution, and frustration would justify it (less needless circling).

[1]: All spots would be even better but I'm assuming a spot that doesn't have a meter isn't going to have any hardware there that could tell whether a car is present.

[2]: http://en.wikipedia.org/wiki/Muni_Meter

[3]: Or even better integration with your smart phone's existing maps/nav app.
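The client side of such an app would be trivial once the feed exists. A rough sketch of the ranking step, with an entirely hypothetical record shape (meter id, occupancy status, distance), since no city exposes exactly this feed today:

```python
def nearest_vacant(meters, limit=3):
    """Return the closest vacant metered spots from a list of meter records."""
    vacant = [m for m in meters if m["status"] == "vacant"]
    return sorted(vacant, key=lambda m: m["distance_m"])[:limit]

# Example feed as a hypothetical API might return it:
feed = [
    {"id": "M-101", "status": "occupied", "distance_m": 40},
    {"id": "M-102", "status": "vacant", "distance_m": 120},
    {"id": "M-103", "status": "vacant", "distance_m": 60},
]
```

The hard part is the sensor data and keeping it fresh, not the software that consumes it.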


San Francisco had real-time information about a large percentage of its parking spots and it was available as an API. Apparently, though, this is no longer the case:

http://sfpark.org/how-it-works/open-data-page/

"As of December 30, 2013, the parking sensors in the street will be turned off and their data feed will no longer be available as parking sensor batteries have reached the end of their useful lives. This means that the real-time information on parking space occupancy will not be available for mobile apps and similar uses. The SFpark data feed and app will continue to show meter parking rates, as well as real-time space availability and rates at parking garages."


Yes, Montreal. Every parking spot in the busy Old Montreal district is numbered and has either a parking pay machine nearby or an iPhone/Android app payment option.

It is a really great solution.


Seems like it should be a FIFO queue weighted on verified proximity/travel-distance to the locale you are trying to park in, if such a system were to be put in place.
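One way to sketch that weighting idea: score each request by a blend of travel distance and arrival order, so a nearby latecomer can't always jump ahead of someone who asked earlier. The class name and the weighting factor here are made up for illustration:

```python
import heapq
import itertools

class SpotQueue:
    """Assigns parking spots by a score blending distance and arrival order."""

    def __init__(self, distance_weight=1.0):
        self._heap = []
        self._arrival = itertools.count()  # monotonic counter preserves FIFO among ties
        self._w = distance_weight

    def request(self, driver, distance_m):
        order = next(self._arrival)
        score = self._w * distance_m + order
        heapq.heappush(self._heap, (score, order, driver))

    def assign_next(self):
        """Pop the driver with the best (lowest) combined score, or None if empty."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]
```

Tuning `distance_weight` is the policy question: set it high and it's nearly pure nearest-first, set it low and it degrades to plain FIFO.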


Singapore has it to some extent. ParkerMeister (one of our apps) is using it.


Besides the usual scaling they'll get from the larger org and cross marketing (e.g. I'm guessing people who book with OpenTable travel more than the average person), imagine Priceline combining its "name your own price" concept with OpenTable's inventory of restaurants to offer prix fixe meals. Pick an area of town, a type of food, headcount, and a max price and they match you with a restaurant that can accommodate.

Not sure if it's a workable idea (I honestly can't see myself ever using it) but it's an interesting angle to think about.


It's also worth noting that some ~90% of the Priceline group's revenue (iirc) is booked through Booking.com, which is one of the top travel sites in the world (and remarkably unknown in the US). Not everything Priceline does is the "name your own price" gig.


Interesting. We used booking.com extensively when travelling in Europe because most of the smaller hotels use it as their booking engine. (And we also heard lots of complaints that they didn't like using it, because the cut was too big... but they were still using it.)


Booking.com is the dominant player in Europe with something approaching 50% of all online hotel bookings going through them. They are unique in that they have tens of thousands of small hotels that use them exclusively as their booking engine and don't distribute to other online travel agencies.

Booking.com uses an agency model so the hotels pay them a commission that tends to be lower than the margin they share with Expedia and others on the merchant model. However, since Booking.com has so much leverage in the marketplace, they have started charging more for premium placement, etc. that hotels have to comply with if they want exposure to their massive audience.


That's a good idea for growth, might help them attract more price sensitive users who probably don't dine out as much as their average user.


We already tried this with Groupon. Restaurants have no benefit in attracting price-sensitive customers as the margins are already so low.


Groupon may not have lived up to the hype, but they're still putting millions of people in restaurants each year. And part of the reason they didn't live up to the hype is because their deals are a lot better for the merchant now (70% of revenue going to the merchant instead of 50%).

There is definitely a model and need for filling empty seats at a restaurant. It just has to be done profitably over the variable costs of food (significant) and additional labor (much less significant).


Yes, and OpenTable is already doing it. They reward repeat customers who pay full price (their points system) rather than trying to generate new customers who are a poor fit for long-term business by discounting their meals.

Discounts are for getting rid of excess merchandise, not acquiring new customers.


Restaurants do prix-fixe menus all the time.


That sounds like an automated version of various cities' Restaurant Week promotions, where participating restaurants offer promo prix fixe menus (sometimes all the same price, sometimes different price levels allowed).

Houston's was such a big hit that it's now Restaurant Weeks and lasts a whole month.


> Not sure if it's a workable idea

This was the first thing I thought of. I do think there might be a challenge convincing people that they should lock themselves into a restaurant (assuming it works the same way as with hotels) though.

