Python - big list issue [closed]

I need to create and search a big list, for example a list of 3.9594086612e+65 arrays of 10x10 (or an even bigger list of bigger arrays).
I need to create a list of combinations. Then I need to filter out some combinations according to a rule, and repeat this process until only one combination is left.
If I try to create this list, it crashes after a few minutes because it runs out of memory.
I imagine the solution is to store the data some other way than as a list in memory.
What is a possible, correct, and easy way to do that? An SQL database? A NoSQL database? Or just multiple text files opened and closed one after another? I need to run through this list multiple times.

Hi:
3.9594086612e+65 arrays is more than all the computer memory in the world could hold, and that is without even multiplying by the size of each array!
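For scale, here is a back-of-the-envelope check, plus the only workable approach: never materialize the list, and instead generate and filter combinations lazily. This is a sketch; itertools.combinations stands in for however the question actually builds its combinations, and passes_rule is a hypothetical placeholder for the filtering rule.

import itertools

# Back-of-the-envelope: 3.9594086612e+65 combinations at just one byte each
# comes to roughly 3.96e+44 zettabytes -- far beyond all storage on Earth.
n = 3.9594086612e+65
print(f"{n / 1e21:.2e} zettabytes at one byte per combination")

# Sketch: never build the list; generate and filter combinations lazily.
def surviving_combinations(items, size, passes_rule):
    for combo in itertools.combinations(items, size):
        if passes_rule(combo):
            yield combo  # only survivors are ever held in memory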

Related

When to use Lists, Sets, Dictionaries, or tuples in python? [closed]

I'm new to Python, and I'm a bit confused about the use cases of its data types.
Can someone please explain in detail when to use each data type, with an example if possible?
Thank you.
Lists are used when you have ordered data you want to modify further, e.g. by sorting.
A dictionary is used when you have two sets of data where each item in the first set corresponds to an item in the other set; the position of the data doesn't matter, only the relation between the two sets does.
A tuple is used when the position of the data is important and you don't want it to change: tuples are immutable.
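A quick illustration of each container (examples are mine; the set, which the question also asks about, holds unique members and supports fast membership tests):

scores = [88, 92, 75]            # list: ordered, mutable -- sort, append, etc.
scores.sort()
ages = {"alice": 30, "bob": 25}  # dict: relates one set of data to another
print(ages["bob"])
point = (4.0, 2.5)               # tuple: fixed positions, immutable
x, y = point
tags = {"python", "tuples"}      # set: unordered, unique, fast "in" checks
print("python" in tags)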

Python: what is the best way to store numerical key value pairs? [closed]

I have data consisting of numerical keys and values.
I need to increase all keys and all values by a number N.
When I use dictionaries for a big amount of data, my code runs very slowly.
What is the best way to store this data, and the best way to increase the values of the pairs?
Example:
N=2
{1:4,3:6,2:1}
expected result:
{3:6,5:8,4:3}
Thanks
You cannot actually do anything faster if you want to rewrite the whole dictionary. Even a plain for loop is not guaranteed to behave like O(N) in practice, because rebuilding the keys can trigger internal re-hashing.
The best thing you can do is keep one extra variable in memory and apply updates to it (note that del is a reserved word in Python, so the variable needs another name, e.g. offset).
Initially:
offset = 0 and d = {1:4, 3:6, 2:1}
When you want to increase the keys and values by N, update:
offset += N
When retrieving the value for key k, use:
d[k - offset] + offset
The best you can do, if you truly must rewrite the data, is O(N) with any data structure: you have to visit the value of each element and increment it.
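A minimal sketch of that offset trick (the wrapper class and its names are mine, not from the answer):

class ShiftedDict:
    def __init__(self, data):
        self._data = dict(data)  # stored once, never rewritten
        self._offset = 0

    def shift(self, n):
        self._offset += n        # O(1) instead of rebuilding the dict

    def __getitem__(self, key):
        # Translate the requested key back to a stored key,
        # then shift the stored value forward.
        return self._data[key - self._offset] + self._offset

d = ShiftedDict({1: 4, 3: 6, 2: 1})
d.shift(2)
print(d[3], d[5], d[4])          # -> 6 8 3, i.e. {3:6, 5:8, 4:3}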

Array vs object - what's faster in Python [closed]

I am wondering what I should do for the purposes of my project.
I am going to operate on about 100,000 rows each time.
What I wanted to do is create a dict "{}" and then, if I need to search for a value, just look it up, for example:
data['2018']['09']['Marketing']['AccountName']
The second option is to pull everything into a list "[]" and, when I need a value, write a function that walks the list and sums the numbers for specific parameters.
But I don't know which method is faster.
I would be thankful if you could shed some light on this.
Thanks in advance,
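For reference, the two options look roughly like this (a sketch; the row layout and the field names other than the question's lookup path are my own assumptions). Building the index costs one pass, after which repeated lookups avoid re-scanning the list:

rows = [
    {"year": "2018", "month": "09", "dept": "Marketing",
     "account": "AccountName", "amount": 125.0},
    # ... about 100,000 rows in practice
]

# Option 1: build a nested dict once; each lookup is then a few O(1) steps.
index = {}
for r in rows:
    node = index.setdefault(r["year"], {}) \
                .setdefault(r["month"], {}) \
                .setdefault(r["dept"], {})
    node[r["account"]] = node.get(r["account"], 0.0) + r["amount"]
print(index["2018"]["09"]["Marketing"]["AccountName"])

# Option 2: scan the whole list on every query -- O(n) per lookup.
total = sum(r["amount"] for r in rows
            if (r["year"], r["month"], r["dept"], r["account"])
            == ("2018", "09", "Marketing", "AccountName"))
print(total)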
If performance (speed) is an issue, Python might not be the ideal choice...
Otherwise:
Might I suggest the use of a proper database, such as SQLite (which ships with Python)?
And maybe SQLAlchemy as an abstraction layer (https://docs.sqlalchemy.org/en/latest/orm/tutorial.html).
After all, they were made for exactly this kind of task.
If that seems overkill: have a look at Pandas.
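A minimal sketch of the database route, using the standard-library sqlite3 module (table and column names are my own):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (year TEXT, month TEXT, dept TEXT,"
             " account TEXT, amount REAL)")
conn.execute("CREATE INDEX idx ON spend (year, month, dept, account)")
conn.execute("INSERT INTO spend VALUES ('2018', '09', 'Marketing',"
             " 'AccountName', 125.0)")

# The nested lookup from the question becomes a single indexed query:
total, = conn.execute(
    "SELECT SUM(amount) FROM spend"
    " WHERE year=? AND month=? AND dept=? AND account=?",
    ("2018", "09", "Marketing", "AccountName")).fetchone()
print(total)  # -> 125.0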

Read each line in a text file, store the values in a list, and compute scores [closed]

The best approach to this problem is to list all the steps you need to perform the task. A typical example:
Read the file.
Read each line.
Split each line into strings delimited by spaces.
Store the strings in a list.
...
Then go to the Python website and look up how to do each step.
Example: for input and output in Python:
https://docs.python.org/2/tutorial/inputoutput.html
You can also look up functions by asking Google, which will point you to answers to questions like:
Q: How to find the mean of a list?
A: Finding the average of a list
If you do this enough times, eventually you will be able to write a program that solves the problem listed.
Good Luck!
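A minimal sketch of those steps put together (the file name and the assumption that the score is the last field on each line are mine; the question gives no format):

scores = []
with open("scores.txt") as f:                 # read the file
    for line in f:                            # read each line
        fields = line.split()                 # split on spaces
        if fields:                            # skip blank lines
            scores.append(float(fields[-1]))  # store the numeric value

if scores:                                    # compute a score, e.g. the mean
    print("mean:", sum(scores) / len(scores))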

Which is more efficient and faster way to access an element? [closed]

Suppose I have a list of a hundred natural numbers, a set of a hundred natural numbers, and a dictionary of a hundred natural numbers (assuming both keys and values are natural numbers). I want to access an element in each of these data types. Which will be the more efficient and faster way to access it? I know I can use performance tools like timeit or cProfile to check, but how will I know which data type to choose, and when?
At a high level:
list lookups are O(n), where n is the length of the list;
dict lookups are O(1) on average, and set membership tests are likewise O(1) on average, since both are hash-based.
This is basic Big-O notation, i.e. complexity analysis.
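A quick timeit comparison along those lines (illustrative only; absolute timings vary by machine):

import timeit

setup = """
data_list = list(range(100))
data_set = set(data_list)
data_dict = {n: n for n in data_list}
"""

for expr in ("99 in data_list",   # O(n): scans the list
             "99 in data_set",    # O(1) average: hash lookup
             "data_dict[99]"):    # O(1) average: hash lookup
    t = timeit.timeit(expr, setup=setup, number=1_000_000)
    print(f"{expr:20s} {t:.3f}s")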
