Function to handle large files

Here's the completed code implementing your requirements:

# Define read_large_file()
def read_large_file(file_object):
    """A generator function to read a large file lazily."""

    # Loop indefinitely until the end of the file
    while True:

        # Read a line from the file: data
        data = file_object.readline()

        # Break if this is the end of the file
        if not data:
            break

        # Yield the line of data
        yield data

# Open a connection to the file
with open('world_dev_ind.csv') as file:

    # Create a generator object for the file: gen_file
    gen_file = read_large_file(file)

    # Print the first three lines of the file
    print(next(gen_file))
    print(next(gen_file))
    print(next(gen_file))

Explanation:

  1. read_large_file(file_object):
    • Reads a file line by line using .readline().
    • Uses a generator (yield) to return lines one at a time.
    • Stops when it reaches the end of the file.
  2. with open('world_dev_ind.csv') as file:
    • Opens the file using a context manager to ensure proper handling.
  3. Creating a generator object:
    • Calls read_large_file(file) to get an iterator.
  4. Printing the first three lines:
    • Uses next(gen_file) to fetch the first three lines lazily (see the sketch after this list for consuming the rest).
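
Because gen_file remembers its position between calls, you can keep consuming it after those next() calls. Here is a minimal sketch (assuming the same world_dev_ind.csv, the read_large_file() defined above, and that the file's first line is a header) that drains the rest of the file in a for loop without loading it all into memory:

# Reopen the file and consume the generator lazily
with open('world_dev_ind.csv') as file:
    gen_file = read_large_file(file)

    # Skip the (assumed) header row
    next(gen_file)

    # Count the remaining rows one at a time
    row_count = sum(1 for _ in gen_file)

print('Rows after the header:', row_count)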

This approach efficiently handles large files without loading them entirely into memory. 🚀
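
As a design note, a plain Python file object is itself a lazy iterator over lines, so for simple line-by-line work you can loop over the file directly; the explicit generator is most useful when you want to add per-line logic inside it. A minimal sketch of the direct approach, assuming the same file:

# File objects already yield one line at a time, so this stays memory-efficient
with open('world_dev_ind.csv') as file:
    line_count = sum(1 for _ in file)

print('Total lines:', line_count)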
