Let me plug OpenLayers, which uses OSM and is way harder to use than Google Maps, but once you stop hating on it you realize it can do a bunch of things that Google Maps really can't, and is more versatile in numerous ways. YMMV as always. (https://openlayers.org/en/latest/examples/).
I've found it more useful for specific applications that I build because it's far more customizable. The map truly feels like it's part of the app and not just some mapbox or leaflet thing sitting on top of gmaps. Again, these aren't hard facts, just stylistic preferences on my part.
ES6 is not really a dealbreaker if you're using something more traditional. I just make my own "map.js" that exposes the interfaces I need, then I transpile it and include it like anything else.
I know that sounded really complicated, but my package.json is 10 lines, that's it.
This way you can use it in a much more conventional way without jumping the whole project over to ES6 syntax.
I find that if you need more complex mapping features then OpenLayers is really good. It can do an amazing amount straight out of the box, but it does take time to get used to the API. It is more like an actual mapping or GIS program.
But for building functionality outside of the map I tend to use Leaflet instead due to the ease of use. You can treat the map more like a black box and the API is easier to use.
Surprised that no one is talking about ways to leverage the server so you’re not doing all the work on the client. You can shift layers off the client by precomputing them, either as raster or as vector tiles, so you don’t push your poor, possibly mobile, client to the max.
Classically you would do this with TileMill for rasters, which was amazing software, if crashy, and, since we’re on a Mapbox thread, what I think put them on the map. I wish it were still actively developed and adapted for vectors.
The viability of using Google Maps really comes down to the application. An application with a lot of users that rely heavily on Street View can become very expensive.
Nice quote: “Don't lose out on the potential of tomorrow by thinking too big today.”
Most production raster tile systems use osm2pgsql + Mapnik. These are not easy to install. You may need to generate two or three sets of raster tiles for different pixel densities. The total number of tiles will explode past around zoom level 15. https://switch2osm.org is a good intro.
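To get a feel for why tile counts explode, the standard slippy-map (XYZ) tiling scheme is simple enough to sketch in a few lines of Python; this is just an illustration of the scheme, not tied to any particular tile server:

```python
import math

def deg2num(lat_deg, lon_deg, zoom):
    """Convert a WGS84 lat/lon to XYZ tile coordinates (slippy map scheme)."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom
    xtile = int((lon_deg + 180.0) / 360.0 * n)
    ytile = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return xtile, ytile

def tiles_at_zoom(zoom):
    """Total number of tiles in a world-covering tileset at a given zoom."""
    return (2 ** zoom) ** 2

# Each zoom level quadruples the tile count; zoom 15 alone is ~1 billion tiles,
# which is why pre-rendering everything past that point gets expensive fast.
print(tiles_at_zoom(15))  # 1073741824
```

In practice renderers skip empty ocean tiles and render high zooms on demand, but the quadrupling per level is the reason the "total number of tiles explodes" past zoom 15 or so.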
The newer way to do raster tiles is to actually generate vector tiles first, and then do the rasterization dynamically. Mapnik can do this, and renderers like Mapbox GL can be run in headless mode to do it as well.
In addition, modern browsers can display vector tiles directly. Mobile apps have a somewhat harder time with this. https://www.maptiler.com/ has a good introduction for this with respect to OpenStreetMap.
I believe, but haven't bothered to profile, that they're lighter. You can also pre-"render" the vector tiles and store them in e.g. S3. All styling is done in-browser, which means you can do neat things like swap out the styles client-side for dark mode without needing to reload map tiles.
The primary standard for vector tiles is Mapbox Vector Tiles (it's an open standard), which is really just a sqlite database following a particular schema.
My own personal experience developing against both png and vector tiles over the past few months is that vector tiles are better in just about every way for the kinds of things you use maps for in apps. I expect that some dataviz applications, though, would be better served by raster tiles, since you'd end up generating a raster overlay anyways.
It depends on the application, but I don't think they're necessarily larger in size. I believe the format is quite compressed (Mapbox Vector Tile/MVT, which I believe is a variant of protocol buffers).
Many vector tile encoders have wide flexibility in geometry simplification/removal depending on zoom level, and this helps big time with tile size. I find them to be quite nice in lower-bandwidth situations because you can overzoom them and still have them look good: if you zoom in and the tiles for the next zoom level haven't loaded, the vector geometry still looks decent, rather than pixellated like a raster tile.
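Overzooming works because a vector tile's geometry can be reprojected into any of its children's coordinate spaces with a shift and a 2x scale. A minimal sketch, assuming the common MVT tile extent of 4096 (real renderers also re-clip and re-simplify):

```python
# Overzooming a vector tile: map a point in a parent tile's local coordinate
# space into one of its four children. EXTENT=4096 is the conventional MVT
# tile extent; this is an illustration, not a full tile transformer.
EXTENT = 4096

def overzoom(point, child_x, child_y):
    """Map (px, py) in parent-tile coords into child (child_x, child_y),
    where child_x and child_y are each 0 or 1."""
    px, py = point
    # The child covers one quarter of the parent: scale up 2x, then shift.
    cx = px * 2 - child_x * EXTENT
    cy = py * 2 - child_y * EXTENT
    return cx, cy

# A point at the parent's centre lands at the origin of the (1, 1) child.
print(overzoom((2048, 2048), 1, 1))  # (0, 0)
```

Because the transform is exact, overzoomed geometry stays crisp at any scale, which is exactly what a scaled-up raster tile can't do.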
It might be that pbfs are stored inside sqlite? The whole thing is really incredibly confusing to be honest.
What I can tell you is that my vector slippymap XHR is requesting .pbf files, so that would definitely imply protobuffers as the wire format. Ultimately that's using t-rex and mapbox-gl.js, so... not sure if there's something intermediate. But the mapbox docs say sqlite? I'm just very confused now.
Yep, versus https://docs.mapbox.com/help/glossary/mbtiles/ -- I had them backwards. My confusion might have been because of the name similarities (mbtiles vs mvt), which took me 3 tries to even just type correctly.
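To untangle the two names: MVT is the protobuf-encoded tile payload (the `.pbf` on the wire), while MBTiles is a sqlite container for storing many such tiles. A minimal sketch using Python's stdlib `sqlite3`, with the table layout from the MBTiles spec; the tile bytes here are a dummy gzip-header placeholder, not a real protobuf:

```python
import sqlite3

# An in-memory MBTiles-style database; table names follow the MBTiles spec.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE metadata (name TEXT, value TEXT)")
db.execute("""CREATE TABLE tiles (
    zoom_level INTEGER, tile_column INTEGER, tile_row INTEGER, tile_data BLOB)""")
db.execute("INSERT INTO metadata VALUES ('format', 'pbf')")

# Store a (dummy) blob; real tile_data would be a gzipped MVT protobuf payload.
db.execute("INSERT INTO tiles VALUES (?, ?, ?, ?)",
           (14, 8645, 10541, b"\x1f\x8b-dummy-payload"))

row = db.execute(
    "SELECT tile_data FROM tiles WHERE zoom_level=? AND tile_column=? AND tile_row=?",
    (14, 8645, 10541)).fetchone()
print(row[0][:2])  # the container stores opaque bytes; decoding them is MVT's job
```

One extra gotcha that feeds the confusion: MBTiles uses TMS row numbering, so `tile_row` is flipped vertically relative to the XYZ scheme most web maps request.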
You can map layers using different sources. You need to use fromLatLng to do the mapping. You can define your own source so long as it follows the tile server protocol (TMS).
You don't need everything; you just provide the areas you want and layer accordingly.
I'm on my mobile typing this but hopefully that's enough to get you started.
It should probably take less than a day. There's certainly nuanced GIS wonkiness, let's not pretend otherwise. But when I did set up a tile server I remember it being one of those things you think you simply cannot do without taking like 6 months of cartography night school, but then 45 minutes later it's done.
The ___domain-specific knowledge required for standard usage amounted to lightly scanning a couple of Wikipedia pages. Nothing serious.
I do a cycle map with my own style and these are my steps and tools:
1) Download a region from geofabrik.
2) Extract, transform and load the data into pgsql with osm2pgsql. I use the osm2pgsql flex mode. QGIS can load OSM extracts directly, but the resulting layers have many errors in the details, and using pgsql is much faster while viewing and rendering. Imposm is also a good alternative to osm2pgsql.
3) Use QGIS for styling the map.
4) Generate static tiles with https://docs.qgis.org/3.10/en/docs/user_manual/processing_al... (With DPI 250 and a tilesize of 512x512 for use on high resolution screens.)
5) Upload the tiles to my own server.
6) Load the tiles in OSMAnd as a custom map.
(Disclaimer: I work at Cesium.) We've been working on making this easier. If you have a bunch of GeoTIFFs of satellite imagery, you can drag and drop them into our platform and get WMTS/TMS tiles you can stream into whatever client.
I feel like this is the story of open source tools: way harder to use, but once you know the tool, the power and freedom are much greater, and sometimes even the pleasure of using it.
Years ago I used openlayers and osm to develop an online map to display data for a research project I was working on. I originally considered using Google's api, but their licensing at the time was incompatible with the terms of our grant regarding our collected data.
I enjoyed working with OSM; it was actually my first experience building an application around an external API. It was a fun experience and I was fairly pleased with the way it turned out.
It ended up going live for a short time, though we ended up running into a snag with the government after finding an endangered species in an active mine site, and were forced to remove our public data and submit it all to a private government database locked behind fees.
> Let me plug OpenLayers, which uses OSM and is way harder to use than google maps but once you stop hating on it you realize it can do a bunch of things that google maps really can't and is more versatile in numerous ways
I have OpenStreetMap on my phone, and "harder to use" describes it well. The problem is that for me, and, I'd bet, for many others, "easy to use" is the most important quality a map app can have.
The conference this is advertising is rather buried in this long article. It's tomorrow, online and free, and the talks sound interesting and varied. Schedule below:
Thanks - lol, hopefully this comment doesn't get buried as well. mapbox, ESRI, Geotab, and Grab will have "sponsor booths" as well. See https://2020.stateofthemap.org/#sponsors
Sweet, it's in Cape Town - well, all online, but I happen to be in the same timezone (opposite hemisphere), and the times of the talks are perfect for catching them during the day this weekend.
A confusion that I had at first, before going further into the OpenStreetMap ecosystem: OSM is not a ready-to-use app and some alternative to Google Maps the app; it's a lower-level database on which front-end renderings with a tileset can be built to serve different usages. If you go to openstreetmap.org you will not see much, but click on "Edit" and zoom in close and you'll get a better idea of the data layer.
Edit: to further clarify the stack, Mapbox is a collection of tools for creating maps from OSM data.
That is rather wonderful but, of course, one typically can't use telnet at work "because security". I got left with it in a mode printing escapes afterwards, and needed reset(1), but I guess that's telnet's fault, unless TERM is wrong.
I think starting with Leaflet and building from that should give you a good understanding of the building blocks for web mapping: https://leafletjs.com/examples.html
For anyone looking to migrate to an OpenStreetMap-backed map and ___location data provider, https://switch2osm.org has a good overview of how, why, and who is offering services.
I’m partial to my own (in bio), of course, but I would be happy to answer questions for anyone looking to switch.
Hm that's pretty weird. I have had both a technical and a billing support request, both have been answered within a day. Including a technical analysis on what likely caused a bug. No support contract or payment either.
Because user expectations are high for interactive web maps - zooming and panning should retrieve new data near-instantly.
OpenStreetMap is an unwieldy but not "big" dataset - it fits easily on a consumer grade SSD. You need to index the dataset so that retrieving a specific slice is extremely fast - doing point in polygon tests, clipping operations on source geometries that have tens of thousands of vertices, etc should not happen at query time. This inevitably means pre-rendering as much as possible.
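For a sense of the per-query work being avoided: even the basic ray-casting point-in-polygon test is O(n) in vertex count, which is why you don't want to run it at request time against geometries with tens of thousands of vertices. A sketch of the classic algorithm:

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: count how many polygon edges a horizontal ray
    from pt crosses. Odd count means inside. O(n) in vertex count."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge cross the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon((2, 2), square))  # True
print(point_in_polygon((5, 2), square))  # False
```

Running thousands of these per tile request against real coastline or administrative-boundary geometries is exactly the cost that spatial indexes and pre-rendering exist to avoid.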
In addition, this needs to work for every intermediate tile level when zooming out - each parent tile covers 4 child tiles, so you need some strategy for decimating the amount of data so that the tile sizes don't increase exponentially as you zoom out. This is beyond a pure computer programming problem and becomes a visual design problem as well - features such as roads or transit layers should form a sensible hierarchy with less important features removed.
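The parent/child bookkeeping itself is just integer halving and doubling of tile coordinates; the hard part, as noted, is deciding what to drop. A sketch in the standard XYZ scheme:

```python
def parent(z, x, y):
    """The tile one zoom level up that covers (z, x, y)."""
    return z - 1, x // 2, y // 2

def children(z, x, y):
    """The four tiles at z + 1 covered by (z, x, y)."""
    return [(z + 1, 2 * x + dx, 2 * y + dy) for dy in (0, 1) for dx in (0, 1)]

# Every child maps back to its parent, which is why the data funnelling into
# each parent tile quadruples unless you decimate (drop minor roads,
# simplify geometry, thin out labels, etc.).
for child in children(10, 511, 340):
    assert parent(*child) == (10, 511, 340)
```

The decimation strategy that fills in "what to keep at each level" is the visual-design half of the problem the comment describes; this arithmetic is only the easy half.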
I've been working on this class of problems for a couple years now and have a related presentation day 2 of the conference mentioned in TFA - details in bio if you'd like to talk more
> OpenStreetMap is an unwieldy but not "big" dataset
Do note that while not BigData big, it _is_ large. Your PostgreSQL database (postgis) on your macbook won't be able to import `planet.pbf` in any reasonable time (think: weeks). Your digital-ocean VPS won't have enough diskspace to process a weekly planet.pbf and the free or cheap tier of your RDS won't be able to handle planet.pbf either.
http://download.geofabrik.de/ (also one of the StateOfTheMap conference sponsors) is a good place to find "chopped up" downloads to avoid that: just get only your country, a province, or even just one city. On my developer env I always run everything through with `luxembourg.pbf` or `iceland.pbf`. Luxembourg-latest does still contain nearly 2.5 million datapoints: `osm2geojson luxembourg-latest.osm | wc -l #=> 2417784`
You are describing a problem inherent to using PostgreSQL, not OpenStreetMap itself. My presentation is specifically about my solution to this, which can easily import planet.pbf in 7-8 hours on a laptop with SSD, and can cut extracts for cities and countries like you describe based on minutely data:
Indeed, a lot is due to PostGIS. I'm using mimirsbrunn[1] a lot, lately, with Elasticsearch as storage; especially because it is fast. It does lack a processor that listens to changesets and imports those, though; so I still need to run a nightly rebuild of the entire database.
Thanks for pointing me to your StateOfTheMap session. I'll "attend" it for sure.
The difficulty depends on what size of market you're trying to serve at what reliability target with geographic constraints, how often you want to update your data, and how many map tiles / second you want to serve.
Consider a few different scenarios:
1. If you're trying to serve a single website and map style, with limited traffic in one region, this is fairly easily accomplished using a single map tile server and one of the established methods (something like 8 cores, 32GB of RAM, and a few hundred GBs of disk will get you most of the way there). The difficulty is mostly determining how you want to set it up and then provisioning the server to do it. This process will probably take you a few days. Updates will be infrequent, map data will be stale, and speed will be tolerable in your targeted geographic region.
2. Let's say you take a step up. You need to serve global traffic with one style, but only a few requests / second. You can still do this via a single tile server, but now you need a CDN to make it fast. Cache misses will be slow since it has to render and then send the tile to the CDN (up to 500ms of latency due to geographic constraints and cache fill times).
3. Let's say you now need multiple styles, global traffic, 10s of tile requests / second, and good reliability. Now you need to distribute your servers, so you'll probably need three or four servers of at least the above-mentioned specs, and you need to keep them in sync.
4. Now let's say you need multiple styles, global traffic, and up-to-date data. You'll need to keep the OSM data stored locally so you can update it (minimum 500GB in Postgres + PostGIS). You need to distribute this all around the world. You need to handle updates, but do them in sync.
All of these are short of actually offering a commercial service for multiple companies, since in that case you need a global service structure, a CDN that's pretty efficient, enough tile servers to handle spikes in demand (pretty common in map tile services), and probably a couple layers of cache to keep everything moving smoothly.
There are a few projects that make this easier. For instance, openmaptiles.org is what our solution was modeled after (and we still use their schema, mostly unchanged), but in the end it's hard because it's a lot of data that requires a lot of compute (either up-front or at a moment's notice) to be responsive, and a lot of orchestration to keep it all well-oiled.
In addition to everything others have already said, I think one of the hardest parts is continually dealing with occasional upstream outages and API changes. Initial automation setup is exciting, repeatedly revisiting it after it worked just long enough to forget almost everything is a chore. It's a good fit for "as a service" outsourcing to full-time experts.
Because it is not just installing various server-side software components and setting up client software to consume them; you also have to deal with processing data. Once you have done it a few times it is quite easy, but the amount of detail can be overwhelming for a beginner.
Just wondering if I could get some suggestions from the community.
I've been asked, preliminarily, by the lab I work for to produce an application that will overlay daily/weekly water quality measurements on a map. I've got no idea where to start as I've never done anything GIS - but I do like keeping things open source. Would this be something to integrate with OSM?
The problem seems simple and generic enough that I'd expect it can be done with zero programming. I just need to find the right place to feed in a table of coordinates and measurements
I know the problem spec is a bit vague, but what's a good place to start?
And if I do need to write an app, is there any way I could stick with Clojure/JVM?
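For a table of coordinates and measurements like that, one near-zero-programming route is converting it to GeoJSON, which QGIS, Leaflet, uMap, and most web maps can load directly as an overlay. A hedged sketch in Python (the column names and values are made up for illustration; swap in your lab's actual fields):

```python
import csv, json, io

# Hypothetical input: one row per sampling site with lat/lon and a measurement.
raw = """site,lat,lon,turbidity_ntu
Dock A,44.476,-73.212,3.2
Inlet B,44.481,-73.205,7.9
"""

features = []
for row in csv.DictReader(io.StringIO(raw)):
    features.append({
        "type": "Feature",
        # Note: GeoJSON coordinates are [longitude, latitude], in that order.
        "geometry": {"type": "Point",
                     "coordinates": [float(row["lon"]), float(row["lat"])]},
        "properties": {"site": row["site"],
                       "turbidity_ntu": float(row["turbidity_ntu"])},
    })

collection = {"type": "FeatureCollection", "features": features}
print(json.dumps(collection, indent=2)[:80])
```

Regenerating this file daily or weekly from the lab's measurements and dropping it onto any OSM-backed basemap would cover the stated requirement; on the JVM, the same structure is easy to emit from Clojure with its built-in JSON tooling.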
For more complex GIS-data exploration, visualisation and research, I'd suggest QGIS. https://www.qgis.org/
It is not the most user-friendly (it's aimed at GIS professionals), but it does have a large, open and friendly community around it, making lots of tutorials, manuals, introductory material and so on.
Edit: especially your type of "problem" is what QGIS is good at: you have a CSV, maybe a PostGIS database, perhaps some old scans of maps, a government-provided shapefile of the waterways, and so on. With QGIS you can project them all over Google/OSM/Bing/Mapbox maps, mix, mash, filter, extract and so on. It is desktop software, so publishing your result would probably mean "make some PDFs" or "render a set of PNGs" or so.
I'm a complete novice when it comes to maps, GIS, and all that. (Though I like to consider myself technically literate in general.) I had no problem getting started with QGIS for that stuff. It is fiddly, of course, but the software was clear and well documented enough.
There are many different ways of doing this. What approach makes sense depends on the skills you have available and the scale you intend to do it at. There are several good examples suggested.
If you are looking for country-scale water quality mapping, then it may be worth looking at the work of the Akvo Foundation. We have helped more than a dozen countries, in collaboration with UNICEF in many of them, to do drinking water quality mapping in rural areas, on our open source service. We have covered an estimated 100+ million people with surveys. We also work with water quality in lakes and streams. Our mobile phone app is integrated with a suite of water quality sensors and testing methods.
If you want to do this without programming, you should upload your dataset into the Mapbox UI and create a new layer to overlay on a customizable basemap. Their free tier will probably cover you unless you're expecting a lot of users.
If you are expecting a lot of users, you'll probably have to program.
The last time I looked for an alternative for Google/Apple Maps some months ago, I couldn't really find a decent OSM (or even Mapbox)-based app for iOS. Any recommendations from anyone?
I really like the web-based Openroute Service [1] that was shared here recently, but AFAIK this doesn't have an app version.
Personally, I use Maps.me for simple tasks like finding a street, shop or other POI. OsmAnd is my favorite tool when navigating during cycling trips because of its far more advanced rendering options. Both apps are great.
OsmAnd, as the name implies, is Android-only, though. Not a problem for me, since I use the F-Droid version, but OP was specifically asking for an iOS app.
EDIT: as below, I stand corrected. Please ignore my false statement above.
I think the issue is that for the mass market, OSM is a technical detail that the apps only mention in passing, if at all.
The best iOS apps with OSM I have found are [Guru maps] (formerly Galileo) and [Pocket Earth]: Have been using paid versions of both for the last 5 (?) years, switching back and forth. Currently on Guru after they added contour maps and offline routing a while back. Haven't checked if Pocket has this yet...
I found OSMand to be terribly slow on my old phone and maps.me sort of OK-ish in performance. My phone is quite old. Searching locations in both of them is not perfect. One sometimes needs to know how to search for things and treat the UI carefully, patiently waiting till it has finished showing results.
Searching locations (geocoding) is never perfect. Google seems to be the best at it, though. Probably because they've put significant effort into doing that specifically, and they simply have access to more data than any other organisation. OSM isn't about geocoding; it's about building a geographic database of the world. Geocoding is just one application, and not one that has seen a great deal of attention, relatively speaking. Personally I often use Google Maps for geocoding if I have to, but most of the time I already know where I'm going on the map so have no need for it.
Unfortunately, Maps.me on android is either replete with trackers or now basically doesn't work (the f-droid expurgated version), though the question wasn't about android. At least that was the case with f-droid fairly recently -- due to changes in how the maps are now distributed as far as I remember.
I tried making an app that would leverage OSM late last year and was disappointed at the amount of false or stale information in there.
I went into the experience thinking it was the wikipedia of maps. But it's not, wikipedia has much better content control than this.
My theory is that it's too easy for random people to enter data into OSM. It should require some sort of validation of the contributor.
And each contribution should have a discussion, just like wikipedia talk pages. I know that sounds like a lot of talk pages but I believe some of those contributions require community discussion to improve.
Examples, please. There is bad info here and there, but it's a minority (expecting 100% accuracy is delusional; even Google Maps has a ton of crap, from skylights that AI recognized as buildings [3] to businesses you can't even tell exist).
Also, for a long time now each changeset can be discussed. [1]
I have a hunch you may be from a country where the community has not taken off as much [2]. These tend to have more dubious data.
There's a number of us who use this kind of data for building backcountry cycling routes, and while it's great guidance, there's both the problem of roads being listed that are no longer passable (nor often even visible at ground level), and of usable routes on the ground that aren't in the data at all.
Of course, this gets back to the way much of the data is crowd sourced... And we're working on that. :)
Anecdotally, in Northern Wisconsin, GMaps is only slightly better than OSM on abandoned logging roads. There are a lot of private gated or abandoned roads and logging tracks which Google Maps will use for route planning.
Thanks, this is interesting. I checked a few spots and it doesn't seem to be quite helpful for what I'm looking for (seeing if a road has had brush grow back in over it making it impassable) but this would be quite useful for other things.
*in the US - it's a TIGER-specific thing (and certainly not exclusive to OSM). It doesn't affect the rest of the world.
Interested to learn what you're doing with cycling routes - I've been working on improving cycling directions in the US (for my site, cycle.travel) for a few years.
Yes, the TIGER problem is US-specific, but I don't think missing roads (or ones wrong due to physical abandonment) are a US-only problem.
For cycling stuff I'm mostly ensuring that mountain bike-specific trails are in OSM and properly tagged (eg: highway=path and NOT highway=cycleway as some are wont to do), with appropriate intersection markers, names, etc.
My goal is to ensure that these trails are in OSM so they'll flow down to Garmin, Trailforks, Mapbox / Strava, etc. I also use the OSM data, exported to vectors, to make PDF maps to support the local trail advocacy/building org that I'm part of.
Print maps are made using osm2ai.pl to get the routes into Adobe Illustrator (AI) format. I then stylize things there. I know I could use GIS software, but QGIS is a pain to get things looking the way I want (similar to Michigan DNR maps) and ArcGIS is too expensive for unpaid volunteer work. It's also way more hand-off-able in AI format; I can send the files to any graphic designer and they can keep-on with it. GIS software requires far more specialized knowledge.
I wanted to use node data on shops, like if they take cash or not for example. Noticed there was already a lot of shops with data in there. But none of it was accurate.
Just one example is that an Apple store in Sweden was set to accept cash. Just for fun I called them up and asked if they take cash and they laughed at me.
It seemed to me that someone had automated input for businesses without actually verifying their data.
Not OP, but I'm in the UK and find the OSM data to be not all that useful. The streets themselves seem mostly accurate, but there is almost no building data.
If you had to rely on OSM data to get around, your best bet would be to live and work in northeast London, within the circular.
A great thing about OSM is that you can improve the data, specifically in your local area so that at the very least it will no longer have spotty coverage of the places you frequent.
OP talked about improving OSM using StreetComplete.
So this is about adding missing speed limits, opening hours, roof shapes, recycling containers, tactile pavement at bus stops, etc. Hardly "creating the map".
When I travel to a place where OSM could use improvement, I make sure that at least the hotel I will be staying at, and nearby restaurants, bars, museums, etc. are correctly tagged with phone numbers, opening hours and so on.
It is not much work and it gives me an opportunity to decide what I want to do before I leave.
When I get there I then have everything ready offline in OsmAnd. In a way I just update OSM instead of adding my own bookmarks/favorites.
nelgaard got the gist of it. I've never encountered an area where OSM didn't have the roads and buildings already mapped. I have encountered areas where it didn't have the speed limits and house numbers of those structures (and, once or twice, places lacking a road name, but that's pretty rare).
But, my personal perspective was a little different. It's more like: I want to use OSM, and get others to use OSM instead of proprietary mapping solutions like Google's, for moral reasons. In that context, being able to make sure my friends can always get to my house using OSM is a great boon.
I realize I didn't share that perspective in my original post, which is probably the source of the confusion here.
YMMV really. OSM in the UK has orders of magnitude more cycling and walking data than Google, for example, but fewer buildings. I don't generally find I need buildings in my day-to-day mapping needs but everyone's different I guess.
I have found this in the U.K., Canada, Japan and Taiwan - OpenStreetMap is way better as a pedestrian or a cyclist in my experience, at least in relatively built-up areas.
It's the same in many parts of the USA. For example, in a college town with 50k students / 50k permanent population, streets are complete with sidewalks, bike trails, etc., but good luck trying to find nearby restaurants by looking around the map like you can do with Google Maps. Even the buildings which are marked with their name and type usually lack operating hours, phone number, website URL and other sorts of info that GMaps has. But the situation is much better, close to GMaps level, in big cities here.
The great thing about OSM is its currency: new developments will be added more quickly. But for completeness and consistency, Ordnance Survey data will just be better. And they do have a free building dataset, and even pre-generated vector tiles.
Even the OS data -- at least what I can see in Aurora -- might not be as complete as where someone has taken an interest in a new development, such as one near here. The OpenData building shapes are a good start, but they appear to have been auto-generated, and usually need correcting from up-to-date satellite imagery, if not a survey.
Whether buildings are mapped, and in what detail, just depends on the area; detailed mapping definitely isn't confined to London. The main utility of mapping buildings for navigation might be to tag them with postcodes, even if that's not ideal. Roads are typically at least as accurate as OS OpenData where I've looked.
These kinds of statements are useless without specifying what kind of data was wrong and where.
In my city (Kraków, Poland) road data, bicycle infrastructure, pedestrian infrastructure is extremely well mapped. Shops? Probably about 3% of them are mapped.
China? You can find entire cities not mapped at all (for multiple reasons, starting with the fact that mapping in China is illegal without permission from the government).
It sounds like a good plan, however, there is the question of how you want to validate the map information.
Someone adds a street. What do you want to do to validate it? Is there some international OSM team, which goes to all the locations and checks, whether there is really a street? With Wikipedia articles it is simpler, because you can check sources mostly online or you have experts on subjects world-wide, not bound by ___location mostly.
A map of the world seems like sooo much work to keep updated, while many Wikipedia articles can stay as they are, if they are OK now, because they treat historical topics for example, or mathematical things. There are new developments in those areas as well, but no one will take a wrong road when driving, because the Wikipedia article was not updated. Or at least it is not likely to happen. By introducing additional barriers, the update frequency on OSM might be even lower. I wonder how many contributors there are for OSM maps. I have personally never updated a map there and have been too lazy to read up on how to do that.
One would have to find a clever way to validate map information, which does not inhibit participation.
There is no validation team, however everyone can check the latest edits [1]. If there's a developed community in a ___location, say someone in your city, you can validate it yourself. Someone can use Notes feature [2] and report that the street has inaccurate information - by linking to some photo proof.
Thankfully many edits (though not all, e.g. POIs) can be checked via satellite imagery [3], of which OSM has permission to use many, the global ones being Bing, Maxar, Esri, and Mapbox. With so many sources of imagery, the update rate is acceptable for most places.
All in all, the result is not as bad as you'd expect. From normal people I hear that OSM is "very accurate" (at least here in Europe). Oddly enough, nobody says anything about blatantly fake info, which does very rarely slip through, but apparently gets reverted.
Now that I think about it, there is a "validation team" of sorts: the Mapbox data team, which uses OSMCha [4] to flag suspicious changesets and check them (albeit not by physically going there).
In many regions any bogus data will be spotted fairly quickly. New mappers are monitored by the community and guided to documentation if necessary, and real abuse is dealt with (either by just reverting vandalism, or blocking the user outright via the Data Working Group).
Areas with local mappers will be watched even more closely.
All in all OSM often reaches a level of detail Google Maps can only dream of. In the Netherlands the number of outright errors is really quite small, and at least everything is pretty much up-to-date. Google Maps is still showing (and routing over) a trunk road in my home town that closed over a year ago.
> A map of the world seems like sooo much work to keep updated
It's really not that bad. Major infrastructure draws the attention of people who are fascinated by highways or railway tracks. The same goes for national cycling networks and the roads and cycleways these use. Major administrative rearrangements (changes in the borders of municipalities etc.) also draw a particular group. And finally, local mappers care about their local piece of map, often down to the newest projects. As a local mapper myself I find it a joy to be the first to map a new street or add a newly assigned name to some unnamed way.
It's also a nice hobby, because you visit parts of your town to survey you wouldn't ever come otherwise.
> I wonder how many contributors there are for OSM maps. I have personally never updated a map there and have been too lazy to read up on how to do that.
I was linked to OSMstats [1] earlier - it was fun to look through, and the "Contributor to Elements Created" ratio made me proud to be a Canadian contributor :)
The answer is crowdsourcing. Everything one person creates or modifies needs to be voted on by multiple people. After x votes with y percent in favour, the change becomes visible to everyone. The x and y numbers could change depending on the user's reputation etc.
Think of it like the Stack Overflow "tasks" you are asked to do all the time: review this question, do this, do that ...
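To make the proposal concrete, here's a sketch of that threshold rule in Python. Everything here - the function name, the vote counts, the 80% ratio, the reputation cutoff - is invented for illustration; it is not anything OSM actually does:

```python
def is_visible(votes, reputation):
    """Hypothetical visibility rule: an edit becomes public after enough
    votes with a high enough approval ratio; trusted users need fewer
    votes. All thresholds are invented for illustration."""
    min_votes = 5 if reputation < 100 else 2   # trusted users need fewer votes
    min_ratio = 0.8                            # y percent in favour
    if len(votes) < min_votes:
        return False
    return sum(1 for v in votes if v) / len(votes) >= min_ratio

# A new mapper's edit needs 5 votes with at least 80% approval:
print(is_visible([True, True, True, True, False], reputation=10))  # prints True
```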
Because of the OSM data model, changes can't be really held in "unaccepted" state. One would run into editing conflicts[1].
On Wikipedia withholding edits until review works in some language versions [2], but because Wikipedia pages are "atoms" that stand on their own, they have it easier.
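The conflict problem mentioned above comes from OSM's optimistic-locking data model: every element carries a version number, and an upload that references a stale version is rejected. A toy model of the idea (the real API speaks XML over HTTP; this just shows the version check):

```python
class ConflictError(Exception):
    pass

def apply_edit(element, based_on_version, new_tags):
    """Toy model of OSM-style optimistic locking: an edit must name the
    element version it was based on; if someone else edited the element
    in the meantime, the edit is rejected instead of silently
    overwriting theirs."""
    if based_on_version != element["version"]:
        raise ConflictError(
            f"expected v{based_on_version}, element is now v{element['version']}")
    element["tags"].update(new_tags)
    element["version"] += 1
    return element

node = {"id": 1, "version": 3, "tags": {"amenity": "cafe"}}
apply_edit(node, 3, {"name": "Corner Cafe"})  # fine, node is now version 4
# a second apply_edit(node, 3, ...) would raise ConflictError
```

Holding an edit in an "unaccepted" queue means its base version keeps going stale while others edit, which is exactly the conflict problem described above.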
This would not work; in nearly all cases there are not enough mappers to handle it.
And anyway, the real problem is stale data - especially shops. Data that was wrong from the start is rare; OSM has nearly no vandalism.
Making data easier to verify is a much better idea. One thing that works nicely is the StreetComplete Android app: about 20% of the shops where I am asked to add opening hours turn out to be gone, so I open a note, take a photo, and later I or some other mapper removes the no-longer-existing shop.
In the outskirts of Lübeck, Germany, Google tried to tell me to walk through a private company area with a gate. OSM not only knew about this, it had far (far, far) more details about everything.
In Waterkloof Heights, Pretoria, South Africa OSM was roughly 7 years out of date and had barely any information even then.
> In Waterkloof Heights, Pretoria, South Africa OSM was roughly 7 years out of date and had barely any information even then.
I did some mapping on the Garden Route years ago. When I later went and looked back at those communities, I was disappointed to see that little work had been done since my own edits long ago. And in general, much seemed to have been added by other foreign overlanders like myself, and not any kind of local OSM community. I wondered if this is because those demographics in South Africa that are affluent enough to be interested in nerdy tech things like OSM, have also grown up in a culture that always warned them of idly walking around towns with phone in hand?
Meanwhile, tiny Lesotho (a landlocked enclave entirely within, and distinctly less affluent than, South Africa) has a passionately active OSM community uniting locals on the ground and international mappers who use satellite imagery. Last year it was officially declared[1] the first country to be "completely mapped" in OSM -- a moving target obviously, but impressive nonetheless.
Very possible. Waterkloof Heights is a very rich area, the only white people I’ve seen walking besides me were joggers. The black people were all there to work at the mansions.
When driving to a beach in Europe, google once tried routing me through a small two-story house. And possibly into a cliff wall, the house sort of blocked the view of how "the road" went from there...
I have edited OSM for 10 years and in my opinion it depends on the area.
In my area the community is organized: people check strange edits and we also run community projects.
For example, some years ago we noticed that a lot of streets had no name, so we built tools to track unnamed streets and worked through them; now about 90% of the streets have a name.
As I said, for me the problem is that the quality of the information is not homogeneous.
You've hit on the central conundrum with OSM: it's a good enough proposition to make it interesting (free as in beer for many uses, acceptably accurate in many places etc), but a long long way from being a reliable, authoritative single source of truth. Which - let's be honest - is often true of Wikipedia as well, moderators notwithstanding.
Improving OSM is an area of active research I believe, but I'd suggest the scale of the challenge is of a different magnitude than can be met by the type of ideas you mention. Think about something like Google Maps, or in the UK OS MasterMap. Each of these likely has hundreds of millions in R&D invested in tools, techniques, processes and infrastructure, as well as an ongoing budget in the hundreds of millions, and numerous well-organised, skilled, full-time staff. There's a critical mass behind such efforts that as yet OpenStreetMap has not been able to muster, which is no criticism of OSM, but just an acknowledgement of the reality of the situation. We're talking about setups that take dedicated organisations years to develop - the history of the Ordnance Survey for example goes back to 1747!
This is NOT a role for OSM[0]. In many cases it is the best source of map data, but it will never be the sole authoritative source of truth.
Treating it as one is a horrible idea and will end in tears.
Of course, a map will never be the authoritative source of a territory -- hence the saying. But what sets OSM apart from many other mapping projects is its focus on local knowledge and actual people. If people in an area call something by a name, then in OSM, that's its name, even if there's no official name for it, or if nobody really uses the official name.
Arguably a complete topographic map is an outdated concept. Trying to create a single picture that can fit on one piece of paper excludes a lot of information. And it is not a constraint we have anymore.
And often the things you want to map are better mapped separately in their own dataset. Just consider woodland. Should that block of trees be mapped as individual trees, a block of woodland, or as a linear hedgerow? Different people would answer that question differently. A topographic map forces everyone to have the same answer.
I think that citizen mapping should move to focus on the particular subjects and themes the individual is interested in, and try to get closer to the underlying phenomena being described.
I should have qualified what I said with 'in the enterprise GIS world' or something similar. I'm talking about numerous occasions I've had organisations say they want to use OSM but then complain about the quality, as if (as someone advising them about geospatial) it's something I have control over!
Anecdotally, I've always found really up-to-date information. The worst thing was that some rural areas didn't have a lot of data (so I added some, at least a bit).
OSM is a really cool idea, and a fantastic dataset! For people curious about what it's like to use OSM in a huge company, a couple engineers from facebook were recently on a podcast discussing it: https://softwareengineeringdaily.com/2020/04/17/facebook-ope...
Some interesting tidbits: FB are hyper-conscious of "graffiti-edits" that can sometimes sneak into OSM, so they built a system to self-host the OSM data, which periodically and selectively merges chunks of data from upstream, using fancy algorithms to flag "suspicious" edits upstream.
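The podcast doesn't spell out the algorithms, but a toy version of such a flagging heuristic might look like the sketch below. Every field name and weight is invented for illustration; it only conveys the flavour of scoring changesets before merging them:

```python
def suspicion_score(changeset):
    """Toy heuristic in the spirit of flagging 'graffiti-edits':
    score a changeset so unusual ones can be held for human review.
    All field names and weights here are invented for illustration."""
    score = 0.0
    if changeset["user_edit_count"] < 10:
        score += 0.3   # brand-new accounts are riskier
    if changeset["deleted"] > 100:
        score += 0.4   # mass deletions are a classic vandalism pattern
    if changeset["name_changes"] > 20:
        score += 0.2   # bulk renames often indicate graffiti edits
    if not changeset["comment"]:
        score += 0.1   # missing changeset comment
    return score

cs = {"user_edit_count": 2, "deleted": 500, "name_changes": 0, "comment": ""}
print(suspicion_score(cs) > 0.5)  # prints True - hold this one for review
```

A real system would presumably learn these weights from labelled changesets rather than hard-coding them.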
I would highly recommend using http://maptiler.com/maps; they provide an easy way to customize your own map. You can also host it on their cloud. Really neat.
I'm biased (I help with the development and built my latest project on top of it), but I love mimirsbrunn https://github.com/CanalTP/mimirsbrunn. Mostly because it is fast, small and relatively easy to deploy (courtesy of Rust). It uses an old (2.x) Elasticsearch as backend, which is a major downside.
This is a little tangential, but does anyone have a strong recommendation for a good iOS OSM-based app? Strong cycling capabilities are extra appreciated.
OSM is also one of the few remaining websites on the internet that allows bring-your-own OpenID 2.0 login! I put together a tiny personal OID server (plug: https://gitlab.com/rendaw/oidle) and OSM was one of the only public places I could test it...
The last three or four backpacking trips I've gone on, I found the OSM data wasn't completely accurate or complete. In each case I went in and fixed it based on other data sources like Strava Heatmap, primary source maps or firsthand GPS traces.
OSM is good, but it's not magic. For critical uses, check a second data source!
Maps are, by definition, outdated. They will have mistakes. They will have abstractions of our messy world that go awry at some point.
OSM is in some areas by far the most accurate and up-to-date source (which is also the reason why, for several areas, both Bing Maps and Apple Maps use OSM as a source). In others it is not.
I often give talks about OSM and always use this: "Google, for example, wants to make a profit; their maps are just another canvas to put ads on. This is not bad. But it does mean that in places where there is no money to be made, Google won't drive around with Street View cars, they won't buy datasets to merge, and they won't spend effort improving or validating the maps there." (At this point I pull open a map of Chad, where Google Maps shows large areas of nothing. Chad is a lot of nothing - sand, really - but it is obvious Google has no interest in making that map any good.)
Tiles can be image-based or vector-based.
I also recommend having a look at https://openmaptiles.org/ if you want vector-based tiles. You can generate your own tileset with it (the output is an mbtiles SQLite file based on an open source schema; the schema describes which information goes into the database and can later be shown), or download a ready-made set of tiles (because, e.g. for the whole world, generating a tileset takes a lot of resources and a long time). There are also servers (like tileserver-gl or tileserver-gl-light) that can serve the tileset - not only as vector tiles but also as graphical image tiles.
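Since an mbtiles file is just a SQLite database, you can also read tiles out of it directly. A minimal sketch in Python (note that MBTiles stores rows in the TMS scheme, so the y coordinate is flipped relative to the usual XYZ addressing; vector tiles come back as gzipped protobuf blobs, raster tiles as PNG/JPEG bytes):

```python
import sqlite3

def read_tile(mbtiles_path, z, x, y):
    """Fetch one tile blob from an MBTiles file (a plain SQLite database
    with a 'tiles' table). Returns None if the tile is not present."""
    tms_y = (2 ** z - 1) - y  # flip XYZ y -> TMS row
    con = sqlite3.connect(mbtiles_path)
    try:
        row = con.execute(
            "SELECT tile_data FROM tiles "
            "WHERE zoom_level=? AND tile_column=? AND tile_row=?",
            (z, x, tms_y),
        ).fetchone()
        return row[0] if row else None
    finally:
        con.close()
```

This is essentially all a tile server like tileserver-gl does per request, plus HTTP headers and (for image output) rasterization.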
The data itself is free. The current map tiles on tile.osm.org are hosted on donated hardware and run by volunteer sysadmins. That's not free: they reserve the right to block you for any reason at zero notice. Don't run your business or app relying on that. :)
There are plenty of companies who take OSM data and make map tiles. Those charge money or give it away for free.
OSM has only been running since 2004; tiles have never been served from d.tile.openstreetmap.org; and just because a volunteer-funded project doesn't want to provide you with unlimited free ponies/tiles doesn't mean the project is dying.
Car navigation? Locating shops? Generally no, and overtaking Google Maps is hard in these fields.
Hiking, cycling - Google Maps is not usable, at least in the places that I visited, while OSM was great.
Doing anything interesting with the data - making your own map, making map-based decorations (I made some laser-cut maps), research involving geodata, data analysis - all of it requires access to the data.
OpenStreetMap allows you to do things completely out of scope for Google Maps, so in many cases it is not even competing.
But if you are an individual driving a car and you are not trying to eliminate Google from your life? Then the Google Maps app is often superior.
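As a small, concrete example of the "making your own map" point: the standard slippy-map formula that converts a longitude/latitude into the z/x/y tile address used by OSM-style tile servers (the well-known Web Mercator scheme, sketched here in Python):

```python
import math

def deg_to_tile(lat, lon, zoom):
    """Convert WGS84 lat/lon to slippy-map tile numbers (the z/x/y
    Web Mercator scheme used by OSM-style tile servers)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# e.g. deg_to_tile(52.37, 4.90, 14) gives the tile covering central Amsterdam;
# a server then maps z/x/y to an image or vector tile at that address.
```

Every tool in this space, from Leaflet to laser-cut map generators, is built on this addressing.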
Yes - not for everyone and everything, but yes. Especially on foot or by bike, but in my area it generally feels like the better map (and in reverse, I'm sure there are areas of the world where OSM is lacking in detail). I basically only use GMaps to search for businesses.
And of course I can use OSM data in many ways I can't use Google data.
I rely on Openstreetmap instead of Google Maps for a few things. Most notably, Google extremely de-emphasizes rail infrastructure (except in Japan). In addition, Openstreetmap also does a better job of laying out political boundaries for more minor polities than Google.