Loading a large .osm file in osmar: why does loading time scale quadratically with file size?

I'm trying to load huge .osm files into R in order to run some advanced analyses. I did not expect the loading step to be the hard part, but it has proved difficult.

The issue is that the time it takes to parse the XML file (.osm is XML-based) and populate an object (here an osmar object) seems to grow quadratically with the file size.

require(osmar)
my_file <- osmsource_file("filename")
# complete_file() asks for the whole contents of the file source
osmar_object <- get_osm(complete_file(), source = my_file)

And in case you wonder how I found that it was quadratic, here is a plot of the measurements, for files between 0 and 500 MB.

[Plot: loading time vs. file size, growing quadratically]
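
A measurement like this can be reproduced along the following lines (the extract file names are placeholders; any set of .osm extracts of increasing size will do):

library(osmar)

# hypothetical extracts of increasing size
files <- c("extract_050mb.osm", "extract_100mb.osm",
           "extract_200mb.osm", "extract_400mb.osm")

# time the full parse-and-populate step for each file
elapsed <- sapply(files, function(f) {
  src <- osmsource_file(f)
  system.time(get_osm(complete_file(), source = src))["elapsed"]
})

plot(file.size(files) / 1e6, elapsed,
     xlab = "file size (MB)", ylab = "loading time (s)")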

At this point, my only way to make it work seems to be the following: split the country into a 100x100 grid of small tiles and load them separately (a sketch of that idea follows). Obviously, I don't really like the idea.
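
For what it's worth, a minimal sketch of that tiling idea, assuming the country fits in a rectangular bounding box and the tiles are fetched from the OSM API. The coordinates, grid size, and API URL are placeholders, and the main API limits the bbox size per request, so this is an illustration rather than a drop-in solution:

library(osmar)

# placeholder bounding box for the whole country, and grid size
left <- -5; bottom <- 41; right <- 10; top <- 52
n <- 100

src <- osmsource_api(url = "http://api.openstreetmap.org/api/0.6/")
lons <- seq(left, right, length.out = n + 1)
lats <- seq(bottom, top, length.out = n + 1)

# fetch each tile as a separate, small osmar object
tiles <- vector("list", n * n)
k <- 1
for (i in seq_len(n)) {
  for (j in seq_len(n)) {
    bb <- corner_bbox(lons[i], lats[j], lons[i + 1], lats[j + 1])
    tiles[[k]] <- get_osm(bb, source = src)
    k <- k + 1
  }
}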

Why is it so slow? And how can I load these files quickly?

