Memory error on large Shapefile in Python


import shapefile

data = shapefile.Reader("data_file.shp")
shapes = data.shapes()

My problem is that getting the shapes from the shapefile Reader raises a MemoryError exception when using pyshp.

The .shp file is quite large, at 1.2 GB. That is only about 3% of my machine's 32 GB of memory, so I don't understand the error.

Is there another approach I can take? Can I process the file in chunks in Python? Or is there a tool to split the file into chunks so I can process each of them individually?

Although I haven't been able to test it, pyshp should be able to read the file regardless of its size or your memory limits. Creating a Reader instance doesn't load the entire file, only the header information.
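
For example (a minimal sketch, assuming your file is named data_file.shp), you can open a Reader and look at header fields such as shapeType and bbox without reading any geometries:

import shapefile

data = shapefile.Reader("data_file.shp")
print(data.shapeType)  # numeric shape type, read from the .shp header
print(data.bbox)       # overall bounding box, also from the header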

It seems the problem here is that you used the shapes() method, which reads all of the shape information into memory at once. That isn't a problem for most files, but your file is too big. As a general rule, you should instead use the iterShapes() method, which reads the shapes one at a time.

import shapefile

data = shapefile.Reader("data_file.shp")
for shape in data.iterShapes():
    # do something with each shape...
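
If you specifically want to process the file in chunks, as asked above, one possible sketch is to wrap iterShapes() with itertools.islice. The chunk_size value here is a hypothetical number to tune to your memory budget:

import itertools
import shapefile

data = shapefile.Reader("data_file.shp")
shape_iter = data.iterShapes()

chunk_size = 10000  # hypothetical batch size; adjust as needed
while True:
    chunk = list(itertools.islice(shape_iter, chunk_size))
    if not chunk:
        break
    for shape in chunk:
        pass  # process each shape in this batch; the batch is then freed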
