How can I read a CSV that is 20 GB in size?

I need to read a CSV file piecemeal so I can vectorize it. The file is approximately 20 GB in size. I need to preprocess it to remove certain columns and split it into training data and test data. How can I do this without getting an out-of-memory error?

@yuniel-acosta I would process it in pieces if possible; there shouldn’t be a need to hold everything in memory at once. Could you share the code you’re using where you run out of memory?
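
As a rough illustration of the chunked approach, here is a minimal sketch using pandas' `chunksize` option, which reads the file one block of rows at a time. The file paths, column names, chunk size, and test fraction below are placeholders, not anything from your setup:

```python
import pandas as pd

# Hypothetical paths and column names -- adjust for your data.
SOURCE = "big_file.csv"
DROP_COLS = ["col_to_drop_1", "col_to_drop_2"]
TRAIN_OUT = "train.csv"
TEST_OUT = "test.csv"
TEST_FRACTION = 0.2
CHUNK_SIZE = 100_000  # rows per chunk; tune to your available memory

first_chunk = True
# read_csv with chunksize returns an iterator, so only one chunk
# is held in memory at a time.
for chunk in pd.read_csv(SOURCE, chunksize=CHUNK_SIZE):
    # Drop the unwanted columns from this chunk.
    chunk = chunk.drop(columns=DROP_COLS)

    # Randomly send a fraction of each chunk's rows to the test set.
    test = chunk.sample(frac=TEST_FRACTION, random_state=42)
    train = chunk.drop(test.index)

    # Append to the output files, writing the header only once.
    train.to_csv(TRAIN_OUT, mode="w" if first_chunk else "a",
                 header=first_chunk, index=False)
    test.to_csv(TEST_OUT, mode="w" if first_chunk else "a",
                header=first_chunk, index=False)
    first_chunk = False
```

If you already know which columns you want to keep, passing `usecols=` to `read_csv` avoids loading the unwanted columns in the first place, which also cuts memory use per chunk.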