I recently started researching a new series of articles on the paradox of parallel research and development in science and technology. It seems like every time I get a great idea, I eventually find that 'Wow, someone else did it also...' Usually this happens months or years down the road; it is rare that I find an idea which has already been done at the time I conceive it. Therein lies the paradox: how can two separate entities conceive of the same design paradigm during nearly the same interval of time?
My best guess is that as new science and technology are discovered, new technologies can be leveraged from them. If an idea stems from this new leverage, and it is sufficiently relevant, it will occur in the same epoch in multiple instances, since it is both relevant to and dependent on that newly leveraged technology. Thus the paradox of 'time being predetermined'.
Anyway, I thought I should also create a thread about all the great ideas I have, or have had, which fall into this paradox.
After doing some research on Big O notation, I read some posts from developers explaining that you have to do 'Big O equations' in your head. Perhaps they were pulling someone's leg? I don't know. But it gave me the idea to create this tool, which can help find the Big O function for a given algorithm. The application uses C# reflection to gain access to the target .NET assembly. Infinite asymptotics and instrumentation are used to graph the function in relation to time. When using infinite asymptotics it is necessary to create a method or property which takes an integer as input. The integer value is the infinity point: the point at which the function is treated as evaluating to an infinite quantity.
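The instrumentation idea can be sketched in a few lines. This is a minimal Python illustration of the approach, not the author's C# tool; the `linear_work` function and the sample sizes are invented for demonstration:

```python
import timeit

def measure_growth(func, sizes):
    """Time func at increasing input sizes; the resulting (size, time)
    pairs approximate the algorithm's growth curve."""
    samples = []
    for n in sizes:
        # Average several runs to reduce timing noise.
        t = timeit.timeit(lambda: func(n), number=5) / 5
        samples.append((n, t))
    return samples

# Hypothetical target: a method that takes an integer n (the "infinity point")
# and does work proportional to n.
def linear_work(n):
    total = 0
    for i in range(n):
        total += i
    return total

samples = measure_growth(linear_work, [1000, 2000, 4000, 8000])
```

Plotting these samples against the input size gives the empirical curve that the tool then compares with candidate Big O functions.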
How accurate is the tool? Roughly 99.99997% when the infinite asymptote point is 1000 or greater!
New features are being added all the time, community support wanted!
The rant was about the fact that I was forced to design an inferior algorithm to demonstrate that I understood the concept of recursion. After much debate about the interview and the methods of solving the problem the interviewer asked me to solve, I decided I needed to study Big O notation and better understand algorithm complexity. It is something I had never done, because I have been gifted with the ability to write code since the age of 10 and have been writing computer programs ever since. During my career as a developer I was professionally mentored by developers who created the Java programming language, and I never felt that I wrote inferior code after that mentoring, since they showed me the best practices for writing high-performance server code.
While researching Big O notation I came across a thread on MSDN in which someone asked whether a tool existed that could analyze an algorithm and return the Big O notation for it.
One developer made the comment: 'I have to admit that I don't know anything about code analysis tools. I'd doubt they could do big-Oh analysis, it requires the kind of computer you have between your ears to pull off tricks like that...'
Having studied statistics, I knew this not to be true. It is in fact very possible to analyze an algorithm as it runs over time, take those measurements as a control set of data, and then run them through a statistic to see if there is a correlation to a known linear, logarithmic, exponential, or other kind of curve. I responded with a few quick demonstrations showing how easy it is to look at the numbers and see that this is a possible solution. After much thought about the problem, I developed a heuristic using Pearson's correlation coefficient which correctly picked the right Big O function for a given set of functions.
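The heuristic can be sketched roughly as follows: correlate the measured times against each candidate growth function and pick the best fit. This is a minimal Python illustration under my own assumptions (the candidate set and the synthetic quadratic timings are invented for demonstration), not the author's C# implementation:

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Candidate growth functions to test the measurements against.
CANDIDATES = {
    "O(log n)": lambda n: math.log(n),
    "O(n)": lambda n: float(n),
    "O(n log n)": lambda n: n * math.log(n),
    "O(n^2)": lambda n: float(n * n),
}

def classify(samples):
    """samples: list of (input size, measured time) pairs.
    Returns the candidate whose curve correlates best with the timings."""
    sizes = [n for n, _ in samples]
    times = [t for _, t in samples]
    scores = {name: pearson_r([g(n) for n in sizes], times)
              for name, g in CANDIDATES.items()}
    return max(scores, key=scores.get)

# Synthetic timings proportional to n^2, standing in for real measurements.
data = [(n, 1e-9 * n * n) for n in (100, 200, 400, 800)]
print(classify(data))  # prints O(n^2)
```

Because the quadratic candidate correlates perfectly with the synthetic data while the other candidates correlate less strongly, the heuristic selects it.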
What I did not expect was that the research would show that an algorithm under test could fail to fit the profile predicted by a standard Big O polynomial analysis. It became obvious that this is due to factors the mathematics does not account for, such as the CPU and the compiler. While the written code might be a linear solution, the engineered solution may not run in linear time, due to hardware and compilers.
So I have updated the project with this conclusion, and as far as I can tell this is the only solution of its kind that can accomplish this task. I believe this technology will help mathematicians and engineers better design their algorithms, and help academics understand the 'proof is in the pudding' concept as it relates to theory and practice. While it may not add much overall value to the field of algorithm development, it should serve as a good way to see both how closely an algorithm scales and how it will perform in application versus theory.
I will continue to work on this project and hope soon to have a version hosted on CodePlex which allows an algorithm to be entered and analyzed in real time while it is being developed. I also plan to decompile the algorithm and do a complete mathematical proof using the tried-and-true Big O polynomial method, to give compiled, native, and live results of the analysis. I will also be adding memory profiling to discover the complexity of the Big O function in terms of memory growth, including stack and heap growth of the algorithm. I believe this would be a truly useful tool for mathematicians, engineers, academics, developers, architects, and quality analysts.
If you would like to join in the process of co-developing this solution, just let me know in this thread.
I am putting together a series on developing graphics applications. This is my first official publication, and I would like some input from the audience on what I should cover. I have an outline for the series, and I would like feedback from graphic designers, developers, and architects. So far I have only submitted the first article in the series:
This is a very basic introductory article for someone with no programming experience. My aim was to make it as simple as possible and to introduce a new way to use your desktop to host your own customizable Silverlight desktop.