
Python: Threaded Reading From Multiple Files

I have a total of 10 files (could be more at some point, but it will be a fixed number). They're small, at around 80 bytes each. While reading from them is all good and works, it's slow.

Solution 1:

Reading small files isn't slow, provided you do it in one go.

First, let's create an 80 byte test file:

dd if=/dev/random of=test1.dat bs=80 count=1
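
If you would rather script the remaining test files than run dd repeatedly, here is a minimal sketch (my own addition, not part of the original answer) that writes test1.dat through test9.dat, the names used in the loop timing below, with 80 random bytes each:

import os

# create nine ~80 byte files of random data, matching the dd command above
for i in range(1, 10):
    with open('test{:d}.dat'.format(i), 'wb') as f:
        f.write(os.urandom(80))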

Then we define a function to read all of it in one go:

In [1]: def readfile(name):
   ...:     with open(name) as f:
   ...:         data = f.read()
   ...:     return data
   ...: 

Then, a timing run (reading from a normal hard disk, not an SSD):

In [3]: %timeit readfile('test1.dat')
10000 loops, best of 3: 18.1 us per loop

So it takes 18 μs to read such a file. I wouldn't call that slow.

When I create 9 of these test files and read them in a loop:

In [3]: %timeit for i in xrange(1,10): readfile('test{:d}.dat'.format(i))
1000 loops, best of 3: 184 us per loop

With the loop overhead it is still only about 21 μs per file.
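
And since the question is about threads: here is a minimal sanity-check sketch (my own addition, Python 3 standard library only, assuming the nine test files from above) that times a ThreadPoolExecutor version against the plain loop. For 80 byte reads, dispatching work to the pool usually costs more than the read itself, so threading is unlikely to make this faster.

from concurrent.futures import ThreadPoolExecutor
import timeit

def readfile(name):
    with open(name) as f:
        return f.read()

names = ['test{:d}.dat'.format(i) for i in range(1, 10)]

def sequential():
    # read the files one after another, each in one go
    return [readfile(n) for n in names]

def threaded():
    # hand every file to a worker thread and collect the results
    with ThreadPoolExecutor(max_workers=len(names)) as pool:
        return list(pool.map(readfile, names))

print('sequential:', timeit.timeit(sequential, number=1000))
print('threaded:  ', timeit.timeit(threaded, number=1000))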

Edit:

Having seen your code, it seems pretty complicated for what it does. I would structure it like this:

data = []
temp = []
# use xrange() instead of range() on Python 2.x
for sn in ['/home/pi/sensoren/sensor{:d}'.format(i) for i in range(1, 11)]:
    with open(sn) as f:
        data.append(f.read())
# the data list now contains all sensor data
for s in data:
    # check for CRC
    d = s.strip()
    if d.startswith("YES"):
        t = d.split("t=")
        # populate temp list
        if t[1] == '-62':
            temp.append("00")
        else:
            temp.append(t[1])

Advantages:

  • This reads every sensor file in one go.
  • It also removes two function calls per sensor.
  • Much less typing.
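
If it helps, the same logic can also be wrapped in a small function. This is just a sketch of that idea, reusing the sensor paths and the "YES"/"t="/-62 handling from the code above:

def read_temperatures(paths):
    temps = []
    for path in paths:
        with open(path) as f:
            d = f.read().strip()
        # keep only readings whose CRC check passed
        if d.startswith("YES"):
            t = d.split("t=")
            # "-62" is treated as an error value, as in the code above
            temps.append("00" if t[1] == '-62' else t[1])
    return temps

paths = ['/home/pi/sensoren/sensor{:d}'.format(i) for i in range(1, 11)]
temp = read_temperatures(paths)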
