Today, while working with CSV files, I got into a situation where I had a list of a million values and had to iterate over it. I thought it would be easier if I split the list into a number of pieces without affecting my code, memory, or CPU. I could also use a generator expression to make the code run faster, but I was curious to write code that splits a list into a given number of pieces (the function itself is a generator).
And here is the result:
def split_seq(seq, num_pieces):
    """ split a list into pieces passed as param """
    start = 0
    for i in xrange(num_pieces):
        # each stride seq[i::num_pieces] has exactly the length of the i-th piece
        stop = start + len(seq[i::num_pieces])
        yield seq[start:stop]
        start = stop

seq = [i for i in range(100)]  ## define your list here
num_of_pieces = 3
for piece in split_seq(seq, num_of_pieces):
    print len(piece), '->', piece
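The same idea can also be written with itertools.islice, which avoids building the extra seq[i::num_pieces] slices just to measure their length. This is only a sketch of an alternative, not part of the recipe above, and split_seq_islice is a name I made up for illustration:

from itertools import islice

def split_seq_islice(seq, num_pieces):
    """ yield num_pieces chunks of seq, front-loading the remainder """
    it = iter(seq)
    size, extra = divmod(len(seq), num_pieces)
    for i in xrange(num_pieces):
        # the first `extra` chunks get one extra element
        chunk_len = size + 1 if i < extra else size
        yield list(islice(it, chunk_len))

for piece in split_seq_islice(range(100), 3):
    print len(piece), '->', piece

For 100 items in 3 pieces, both versions give chunks of 34, 33 and 33 elements.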
Friday, September 10, 2010
Friday, September 3, 2010
Dynamically open files
Recently I had to split a 40GB CSV file into 60 pieces for further processing, with the name and content of each piece decided dynamically based on an id present in that CSV file. That made me write a little code to open/write/close files dynamically. I wrote the following code to achieve this.
a = range(10)
for each in a:
    # build and exec the open/write/close statements, one file per id
    s = "fl_%s = open('%s','a')" % (each, each)
    exec s
    exec "fl_%s.write('%s')" % (each, each)
    com = "fl_%s.close()" % (each)
    exec com
    # can also check whether the file is closed or open by
    # eval("fl_%s.closed" % each)  # returns True if the file is closed, else False
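As a side note, the same dynamic open/write/close can be done without exec by keeping the file objects in a dict keyed by the id. This is just a sketch of that alternative; the handles dict and its keys are illustrative, not taken from the CSV job above:

handles = {}  # id -> open file object
for each in range(10):
    # open one file per id, named after the id itself
    handles[each] = open(str(each), 'a')
    handles[each].write(str(each))

for each, fl in handles.items():
    fl.close()
    print each, 'closed?', fl.closed  # True once the file is closed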