Here's the completed code implementing your requirements:
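(A sketch reconstructed from the explanation below; the `while True` / `break` loop is an assumed implementation of the end-of-file check it describes.)

```python
def read_large_file(file_object):
    """Generator that yields a large file's lines one at a time."""
    while True:
        # Read a single line from the file
        data = file_object.readline()

        # Stop when the end of the file is reached
        if not data:
            break

        # Yield the line back to the caller
        yield data


# Open a connection to the file with a context manager
with open('world_dev_ind.csv') as file:

    # Create a generator object for the file: gen_file
    gen_file = read_large_file(file)

    # Print the first three lines of the file
    print(next(gen_file))
    print(next(gen_file))
    print(next(gen_file))
```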
Explanation:
- `read_large_file(file_object)`:
  - Reads the file line by line using `.readline()`.
  - Uses a generator (`yield`) to return lines one at a time.
  - Stops when it reaches the end of the file.
- `with open('world_dev_ind.csv') as file:`:
  - Opens the file using a context manager to ensure proper handling.
- Creating a generator object:
  - Calls `read_large_file(file)` to get an iterator.
- Printing the first three lines:
  - Uses `next(gen_file)` to fetch the first three lines lazily.
This approach efficiently handles large files without loading them entirely into memory. 🚀
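As a hypothetical usage example, the same generator could drive a plain `for` loop over the file, processing one line at a time without ever holding the whole file in memory:

```python
# Hypothetical usage: count the file's rows lazily with the generator above
with open('world_dev_ind.csv') as file:
    gen_file = read_large_file(file)

    row_count = 0
    for line in gen_file:
        row_count += 1

    print(f"Total rows: {row_count}")
```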