There are people (apparently, @Graptemys) who edit on a very large scale, "bump into" very large data like these, and find them to be a "beast" for their machine, causing difficulty. That much, we can all agree, happened. Whether simplification is a solution, or the best solution, is where we don't agree.
While it's true for most OSM human editors (I keep saying "human" before "editor" to distinguish from a "software" editor, like JOSM or iD), not everybody who edits OSM is a "local activity" (human) editor. SOMEbody had to enter these data, and others will likely need to maintain them as they "bump into" them (e.g. splitting a way to become part of another boundary).
I suppose what this comes down to is "when is big TOO big?" and "when is 'rough' (or ragged, or coarse…) TOO rough?" As I've said, OSM doesn't have (strictly defined, strictly enforced) data quality standards. Though, this thread seems to offer significant voices asking the question, "Well, should we?"
I might imagine (as they already exist) good data practices like "no more than 1800 (1801? 1900? 2000?) nodes in a way" (the API does, in fact, enforce a hard maximum of 2000 nodes per way), as it may be that somebody a while ago found good numbers like that to use so as to minimize our software choking on data bigger than those (in "typical" hardware environments). But as to precision, or "hugging a coastline within so many centimeters" or "with no more than 0.5 meters (or 1, or 2.5, or 3.0…) of distance between nodes…" OSM has no such rules, at least any that are hard-and-fast or enforceable. That also means somebody could clean them up (reduce nodes, as @Graptemys originally suggested back in the initial post), as long as "not too much precision is thrown away," without really offending OSM.

But look: what happened is this thread, a solicitation to our community of "what should we do here?" I don't think the suggestion that we simplify existing data is outrageous; in fact, it is quite reasonable in my opinion. What I have found more fascinating than the technical results (unfortunately, so far wholly undetermined) is the seeming difficulty the community has in concluding what we might do about such "beasts." (The word "beast" has been used before in other Discourse threads about such data, as they exist in many places in our map.)
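To make "reduce nodes within a distance tolerance" concrete, here is a minimal sketch of Douglas-Peucker-style simplification, the same family of algorithm behind tools like JOSM's Simplify Way. It assumes Python with the shapely library and already-projected coordinates in meters; the 1.0 m tolerance is my arbitrary illustration, not an OSM standard:

```python
# A minimal sketch of tolerance-based node reduction (Douglas-Peucker family),
# roughly what "Simplify Way" style tools do. Assumes shapely is installed;
# the tolerance value is illustrative only, not a community-agreed number.
from shapely.geometry import LineString

def simplify_way(coords, tolerance_m=1.0):
    """Drop nodes from a way so the result stays within tolerance_m of the
    original line. coords are (x, y) pairs in a projected CRS (meters);
    raw lon/lat degrees would need reprojection first."""
    simplified = LineString(coords).simplify(tolerance_m, preserve_topology=True)
    return list(simplified.coords)

# An over-noded, nearly straight segment collapses to its two endpoints,
# because every intermediate node deviates less than 1.0 m from the chord.
dense = [(0.0, 0.0), (10.0, 0.3), (20.0, -0.2), (30.0, 0.1), (40.0, 0.0)]
print(simplify_way(dense))  # -> [(0.0, 0.0), (40.0, 0.0)]
```

The tolerance parameter is exactly the open question above: the algorithm is well understood, but OSM gives no blessed value for "how much precision is too much to throw away."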
So: it seems understood that "if you are going to ride the ride, you really should be 'at least this' tall" (lots of RAM helps, even knowing not everybody has gobs of RAM). I think this is what @dieterdreist means by "dedicated tools." But what we don't seem to have addressed is "what if I find one of these and it is seriously over-noded in my opinion, can I 'clean it up' (using something like JOSM's Simplify Way)? Must I consult with the community first?" I realize consulting first is always a good idea, but look where it got @Graptemys: a very long thread, and not a lot of consensus.