Python - Import txt files in a sequential pattern

In a directory I have, say, 30 txt files, each containing 2 columns of numbers with 6000 numbers in each column. I want to import the first 3 txt files, process the data to get the desired output, and then move on to the next 3 txt files.
The directory looks like:
file0a
file0b
file0c
file1a
file1b
file1c ... and so on.
I don't want to import all of the txt files simultaneously; I want to import the first 3, process the data, then the next 3, and so forth. I was thinking of making a dictionary, though I have a feeling that might involve writing each file name into the dictionary, which would take far too long.
Edit:

For those interested, I think I have come up with a workaround. Feedback is appreciated, since I'm not sure if this is the quickest or most Pythonic way to do things.
import glob
import numpy as np

def chunks(l, n):
    # yield successive n-sized slices of l
    for i in range(0, len(l), n):
        yield l[i:i + n]

data = []
txt_files = glob.iglob("./*.txt")
for f in txt_files:
    d = np.loadtxt(f, dtype=np.float64)
    data.append(d)

data_raw_all = list(chunks(data, 3))
Here 'data' is a list of the arrays loaded from the text files in the directory, and 'data_raw_all' uses the function 'chunks' to group the elements of 'data' into sets of 3. This way, selecting one element of 'data_raw_all' selects the data from the corresponding 3 text files in the directory.
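The grouping step can be sanity-checked in isolation with made-up filenames (hypothetical names only; no files are actually read here):

```python
def chunks(l, n):
    # yield successive n-sized slices of l
    for i in range(0, len(l), n):
        yield l[i:i + n]

# Hypothetical filenames mimicking the directory layout above
names = ["file0a.txt", "file0b.txt", "file0c.txt",
         "file1a.txt", "file1b.txt", "file1c.txt"]
groups = list(chunks(names, 3))
# groups[0] holds the first three names, groups[1] the next three
```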
First of all, I have nothing original to include here and do not want to claim any credit, because this comes from the Python Cookbook, 3rd ed., and a wonderful presentation on generators by David Beazley (one of the co-authors of the aforementioned cookbook). However, I think you might benefit from the examples given in that slideshow on generators.
What Beazley does is chain a bunch of generators in order to do the following:
- yield the filenames matching a given filename pattern.
- yield open file objects for a sequence of filenames.
- concatenate a sequence of generators into a single sequence.
- grep a series of lines for those matching a regex pattern.
All of these code examples are located here. The beauty of this method is that the chained generators only chew through the next piece of information as it is needed: you don't load all of the files into memory in order to process all of the data. It's a nice solution.
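From memory, the four generators described above look roughly like this. This is a sketch in the spirit of Beazley's slides, not a verbatim copy; the function names (gen_find, gen_open, gen_cat, gen_grep) follow the ones used there:

```python
import os
import re
import fnmatch

def gen_find(filepat, top):
    # yield filenames under 'top' matching a glob pattern
    for path, dirlist, filelist in os.walk(top):
        for name in fnmatch.filter(filelist, filepat):
            yield os.path.join(path, name)

def gen_open(filenames):
    # yield open file objects for a sequence of filenames
    for name in filenames:
        yield open(name)

def gen_cat(sources):
    # concatenate a sequence of iterables into a single sequence
    for src in sources:
        for item in src:
            yield item

def gen_grep(pat, lines):
    # yield only the lines that match a regex pattern
    patc = re.compile(pat)
    for line in lines:
        if patc.search(line):
            yield line
```

Each stage lazily consumes the one before it, so only one item is in flight at a time no matter how many files are involved.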
Anyway, if you read through the slideshow, I believe it will give you a blueprint for what you want to do: you just have to change the information you are seeking. In short, check out the slideshow linked above, follow along, and it should provide a blueprint for solving your problem.
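Adapting that idea to this question, the grouping itself can also be a generator, so filenames are consumed lazily three at a time rather than loading everything first. A sketch (gen_groups is my own name, not from the slides; the commented-out np.loadtxt usage is the part you would take from the question's code):

```python
from itertools import islice

def gen_groups(iterable, n):
    # yield lists of up to n consecutive items from any iterable
    it = iter(iterable)
    while True:
        group = list(islice(it, n))
        if not group:
            break
        yield group

# Hypothetical usage, assuming the directory layout from the question:
# import glob
# import numpy as np
# for group in gen_groups(sorted(glob.iglob("./*.txt")), 3):
#     arrays = [np.loadtxt(f, dtype=np.float64) for f in group]
#     ...process these three files, then move on to the next three...
```

Sorting the filenames first matters here, since glob does not guarantee the file0a, file0b, file0c ordering the grouping relies on.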