Hi,
(if this is in the wrong thread, please move it to an appropriate section)
I was curious whether anyone knows a lower limit on how much "time" or "energy" it would minimally take to come up with an algorithm, where the effort spent is proportional to the algorithm's time complexity.
Hypothesis: it probably takes very little effort to come up with bubble sort, but much longer to reinvent the General Number Field Sieve.
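For concreteness, here is roughly all the machinery bubble sort needs (a minimal Python sketch): compare adjacent elements, swap when out of order, repeat until nothing changes.

```python
def bubble_sort(xs):
    """Sort a list using only 'compare neighbours, swap if out of order'."""
    xs = list(xs)
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(xs) - 1):
            if xs[i] > xs[i + 1]:
                xs[i], xs[i + 1] = xs[i + 1], xs[i]
                swapped = True
    return xs

print(bubble_sort([5, 1, 4, 2, 8]))   # [1, 2, 4, 5, 8]
```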
Clearly there is some kind of relation, but part of it also depends on the agent's own awareness of mathematical/programming axioms. If I have access to all the "research" up to now, it would probably be much easier to rediscover any known algorithm, whereas if all I know is elementary arithmetic, I will probably have a harder time coming up with algorithms that were only possible (to humans) because of advances in mathematics/CS research.
I'm mainly concerned with the tabula rasa case: the agent's knowledge is a clean slate, and it knows only the most basic mathematics and the rules for what counts as an algorithm (must be a finite set of steps, etc.). Is it possible to quantify how long it takes such an agent to rediscover an algorithm, and would this information even be useful?
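One crude way to make this quantifiable, in the spirit of Levin search: enumerate programs over a tiny instruction set in order of length, and count how many candidates a blind searcher must test before one satisfies the spec "sorts all the test lists". Just a toy sketch; the primitives below are made up purely for illustration.

```python
from itertools import product

def bubble_pass(xs):
    """One pass of adjacent compare-and-swap."""
    xs = list(xs)
    for i in range(len(xs) - 1):
        if xs[i] > xs[i + 1]:
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return xs

def reverse(xs):
    return list(reversed(xs))

def rotate(xs):
    return xs[1:] + xs[:1]

PRIMITIVES = [bubble_pass, reverse, rotate]
TESTS = [[3, 1, 2], [5, 4, 3, 2, 1], [2, 2, 1], [1, 3, 2, 5, 4], [1]]

def satisfies_spec(program):
    """Does composing these primitives sort every test list?"""
    for t in TESTS:
        out = list(t)
        for op in program:
            out = op(out)
        if out != sorted(t):
            return False
    return True

tried, found = 0, None
for length in range(1, 6):            # shorter programs first
    for program in product(PRIMITIVES, repeat=length):
        tried += 1
        if satisfies_spec(program):
            found = program
            break
    if found:
        break

print(f"tested {tried} candidates")
print("found:", [op.__name__ for op in found])
```

With this particular (made-up) instruction set, the searcher ends up "rediscovering" bubble sort as four repeated bubble passes after a few dozen candidates. The number itself is meaningless, but counting candidates weighted by program length is essentially how Levin search formalizes discovery effort.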
Also, why are certain properties of mathematics, or more complex algorithms, not intuitive to the simple learner? For example, given the (high school/calculus) definitions of pi, e, and the imaginary unit i, it would be bonkers for me to see the following and say "oh wow, that makes sense":
e ^ (pi * i) = -1
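Not that this makes it any more intuitive, but the identity is at least easy to check numerically, e.g. in Python:

```python
import cmath

# Numerically check Euler's identity: e^(i*pi) should equal -1,
# up to floating-point error in the imaginary part.
z = cmath.exp(cmath.pi * 1j)
print(z)                       # (-1+1.2246467991473532e-16j)
print(abs(z - (-1)) < 1e-12)   # True
```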
Yet it's not really a mathematical fact that was invented; it always existed, as a consequence of the properties and axioms by which those constants are defined, together with the definition of the exponentiation operator (or at least that's my opinion).
The same is true for algorithms: to be efficient, they have to take advantage of the structure/properties of what is being manipulated. Binary search runs in logarithmic time on a sorted list precisely because the list is sorted. That much is fairly intuitive; it's even how people look things up in dictionaries, phone books, etc.
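A minimal sketch of the idea: each comparison discards half of the remaining range, which is only valid because sortedness tells you which half the target must be in.

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            lo = mid + 1   # target, if present, is in the upper half
        else:
            hi = mid - 1   # target, if present, is in the lower half
    return -1              # not found

print(binary_search([2, 3, 5, 7, 11, 13], 7))   # 3
```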
But something like the General Number Field Sieve? It's not immediately obvious (to me) how the properties of the natural numbers facilitate a factorization algorithm with that time complexity.
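GNFS itself is far too intricate to sketch here, but even the much simpler Pollard's rho shows the flavour of exploiting number-theoretic structure: instead of blind trial division, it walks the pseudo-random sequence x → x² + c (mod n), which falls into a short cycle modulo n's smallest prime factor, and cycle detection plus gcd extracts that factor. A rough Python sketch (assumes n is composite; it would loop forever on a prime):

```python
import math
import random

def pollard_rho(n):
    """Return a nontrivial factor of composite n via Pollard's rho."""
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        f = lambda x: (x * x + c) % n
        x = y = 2
        d = 1
        while d == 1:
            x = f(x)                       # tortoise: one step
            y = f(f(y))                    # hare: two steps (Floyd cycle detection)
            d = math.gcd(abs(x - y), n)
        if d != n:                         # nontrivial factor found
            return d
        # d == n: unlucky choice of c, retry with a new one

print(pollard_rho(8051))   # prints 83 or 97 (8051 = 83 * 97)
```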