Memory error reading a 50GB file on EC2

I have a 50GB JSONL file sitting on an EC2 machine that I am trying to read from the same machine as follows:

df = pd.read_json("my_file.jsonl", lines=True)

But I get a MemoryError, so it doesn't work as claimed:

Can someone please help me with this?

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/site-packages/modin/pandas/io.py", line 143, in read_json
    return DataFrame(query_compiler=BaseFactory.read_json(**kwargs))
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/site-packages/modin/data_management/factories.py", line 60, in read_json
    return cls._determine_engine()._read_json(**kwargs)
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/site-packages/modin/data_management/factories.py", line 64, in _read_json
    return cls.io_cls.read_json(**kwargs)
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/site-packages/modin/engines/base/io/text/json_reader.py", line 48, in read
    partition_id = cls.call_deploy(f, chunk_size, num_splits + 3, args)
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/site-packages/modin/engines/base/io/text/text_file_reader.py", line 14, in call_deploy
    re.subn(quotechar, b"", chunk)[1] + re.subn(quotechar, b"", line)[1]
  File "/home/ubuntu/.virtualenvs/eval/lib/python3.6/re.py", line 202, in subn
    return _compile(pattern, flags).subn(repl, string, count)
MemoryError

Hi @rameshk82, welcome! How much memory is on the EC2 machine you are using?
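
In the meantime, if the file is larger than the machine's available RAM, one common workaround is to stream it in chunks with plain pandas rather than materializing the whole DataFrame at once. A minimal sketch, assuming plain pandas is acceptable here (the chunk size and the handle function are placeholders, not part of your setup):

import pandas as pd

# Iterate over the JSONL file in fixed-size chunks so that only one
# chunk is resident in memory at a time, instead of all 50GB.
reader = pd.read_json("my_file.jsonl", lines=True, chunksize=100_000)
for chunk in reader:
    handle(chunk)  # hypothetical per-chunk processing; replace with your own logic

Each chunk is an ordinary DataFrame, so you can filter or aggregate it and keep only the reduced result rather than the full dataset.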