|
As haptic solutions for virtual reality, the NormalTouch and TextureTouch 3D haptic shape controllers let you explore virtual space with just your finger. If only there were some sort of reality you could touch...
|
|
|
|
|
|
I know one industry that will be all hands on for this...
|
|
|
|
|
And I can't help but think of "Demolition Man".
DURA LEX, SED LEX
GCS d--- s-/++ a- C++++ U+++ P- L- E-- W++ N++ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t++ 5? X R++ tv-- b+ DI+++ D++ G e++>+++ h--- ++>+++ y+++* Weapons extension: ma- k++ F+2 X
If you think 'goto' is evil, try writing an Assembly program without JMP. -- TNCaver
When I was six, there were no ones and zeroes - only zeroes. And not all of them worked. -- Ravi Bhavnani
|
|
|
|
|
|
I have worked on haptics technology for years, along with virtual and augmented reality devices. The problem with a true sense of touch is that there are three primary aspects that haven't been achieved, and probably never will be, external to the human body: pressure (tactile), temperature, and muscle control. Pressure and temperature can be roughly achieved, but preventing the human body from penetrating the virtual object is impossible without some sort of external rigging that breaks the illusion. After years of research, the only possible solution is the interruption and manipulation of the electrical signals sent to the brain.
On a side note, it's the porn industry that is always willing to invest millions into private companies to get this to work. I was approached multiple times by Vivid, offering pretty much a blank check to get this working.
When you are dead, you won't even know that you are dead. It's a pain only felt by others.
Same thing when you are stupid.
modified 19-Nov-21 21:01pm.
|
|
|
|
|
Donathan.Hutchings wrote: but preventing the human body from penetrating the virtual object is impossible without some sort of external rigging that breaks the illusion. After years of research, the only possible solution is the interruption and manipulation of the electrical signals sent to the brain.
Or some kind of smart textile that becomes rigid in the "touched" regions, conveying the "hard" feeling to your skin instead of messing with your nervous system. Then make it a full-fitting suit, like neoprene for diving.
Probably easier to build, and personally... I would prefer such an approach over someone getting into my brain
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
Yes, but the muscle would keep moving due to inertia, while touching a real surface actually interferes with the movement, and it does so much faster than the brain can consciously process.
DURA LEX, SED LEX
GCS d--- s-/++ a- C++++ U+++ P- L- E-- W++ N++ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t++ 5? X R++ tv-- b+ DI+++ D++ G e++>+++ h--- ++>+++ y+++* Weapons extension: ma- k++ F+2 X
If you think 'goto' is evil, try writing an Assembly program without JMP. -- TNCaver
When I was six, there were no ones and zeroes - only zeroes. And not all of them worked. -- Ravi Bhavnani
|
|
|
|
|
That would give a rough approximation, but also consider practicality. As far as I know, no such material exists, and having to wear an entire suit to simulate touch is highly impractical. When I was working on these projects, the focus was on training: placing the student in an environment to learn dangerous tasks without placing them in actual danger. Entertainment isn't stable enough to sustain research into this, because people get bored of it rather quickly. Other areas, such as remote surgery, are highly sought after, but there the simulation of touch becomes even more difficult, because the device has to emulate the properties of human flesh for the surgeon.
When you are dead, you won't even know that you are dead. It's a pain only felt by others.
Same thing when you are stupid.
modified 19-Nov-21 21:01pm.
|
|
|
|
|
Their new hybrid system – called a Differentiable Neural Computer (DNC) – pairs a neural network with the vast data storage of conventional computers, and the AI is smart enough to navigate and learn from this external data bank. "We monitor many frequencies. We listen always. Came a voice, out of the babel of tongues, speaking to us."
|
|
|
|
|
A "state actor" has cut off internet access for Julian Assange, the founder of WikiLeaks, the transparency activist organization said Monday. Did you try restarting the router?
|
|
|
|
|
Sniff, sniff... I smell Clinton.
In this present crisis, government is not the solution to our problem; government is the problem. ~ Ronald Reagan
|
|
|
|
|
Mike Mullikin wrote: Sniff, sniff... I smell Clinton
Dirty boy!
I came into this game for the action, the excitement. Go anywhere, travel light, get in, get out, wherever there's trouble, a man alone. Now they got the whole country sectioned off, you can't make a move without a form.
modified 31-Aug-21 21:01pm.
|
|
|
|
|
No need for conspiracy theories; the internet is like that right across the UK
I came into this game for the action, the excitement. Go anywhere, travel light, get in, get out, wherever there's trouble, a man alone. Now they got the whole country sectioned off, you can't make a move without a form.
modified 31-Aug-21 21:01pm.
|
|
|
|
|
It’s said that learning a new language can change the way you think, by giving you a new perspective on the world. But how does that apply to your first programming language? *Now* I know who to blame
Backuuuuuuuuuusssssss!
|
|
|
|
|
Finally, a reason to hate VB6.
|
|
|
|
|
Edsger W. Dijkstra[^] said it first:
Quote: It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
|
|
|
|
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
|
|
|
|
|
I taught myself GW-BASIC in 1985. It was my first language. I then got exposed to dBase III - which I could just NOT wrap my mind around. The whole concept of procedures was so foreign coming from a line-number-driven environment. I think that even today, 30 years later, I'm still a little scarred by GW-BASIC.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
|
|
|
|
|
that's why I'm slow as a turtle!
I'd rather be phishing!
|
|
|
|
|
Technically, my first language was the numeric keystroke encoding on the HP-25C, with its 49-step program memory. I learned a lot about code optimization.
I then moved up to the HP-67, which I think offered around 200 program steps, then the HP-41C.
Somewhere in the middle I was doing BASIC programming on a teletype connected to a PDP-11, where I finally wrapped my head around the idea that when you tell the computer a = 5 and later say print a+1, there's some memory somewhere that contains 5, and a reference somewhere such that a refers to that memory cell. That actually took a while.
When I progressed to the Commodore PET, the world opened up to hardware, registers, and 6502 assembly language. Not to mention early concepts of a BIOS.
Doing a lot of assembly programming kept teaching me skills like code optimization, DRY principles way before the acronym was invented, good documentation and variable / entry point naming skills, and most certainly, good debugging skills.
I also got introduced to parallel processing in a Pascal-like language called SAIL, which was sort of a mind-bender because line numbers and goto / jmp (in assembly) disappeared. I really had no concept of a stack yet at that point. In a summer class I took at Stanford, taught by a guy who would pick his zit scabs and eat them (I kid you not, and his classes were being video-taped!), I simply could not wrap my head around recursion or stack-based variables, and ended up spending my time in front of the computer playing Star Trek instead of writing the quicksort algorithm. Heck, I didn't even know what question I should be asking; my programming brain was so oriented around linear-sequence programming. I'm not sure I finally grokked the concept of a stack vs. a heap until years later.
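For what it's worth, the recursion that was so hard to see back then takes only a few lines once the stack model clicks. A minimal quicksort sketch in Python (the names here are mine, not from any course material) — each call gets its own stack frame, which is exactly the concept that linear-sequence programming hides:

```python
def quicksort(items):
    """Recursive quicksort: each call gets its own stack frame,
    so pivot, smaller, and larger are fresh per invocation."""
    if len(items) <= 1:          # base case ends the recursion
        return items
    pivot, *rest = items
    smaller = [x for x in rest if x <= pivot]
    larger = [x for x in rest if x > pivot]
    return quicksort(smaller) + [pivot] + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```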
My introduction to C was one of "run away". All those ridiculous symbols and weird behaviors, like i++ vs. ++i. Really? Who invented this horribly confusing language to write and read? Pascal was my buddy by then, courtesy of Borland and $99 Turbo Pascal.
So I basically skipped C, but really liked C++. It made sense -- classes, encapsulation, polymorphism. But I got "base class" wrong. It's a BASE class, right? So like the base of a pyramid, it's at the bottom of the derivation hierarchy, right??? I wrote a whole app that way. Templates were amazing; they also made so much sense.
In many ways, C++ very much refined my ideas about separating code into isolated blocks. Still, I found that there was too much inter-dependency between classes. Re-use was a myth, not a reality. Tight coupling made the code monolithic. It was only by careful planning and using higher-level architectures, like a publisher-subscriber pattern, that I was able to begin to disentangle the monolithic nature of applications. Which also led me down the path of loading DLLs dynamically at runtime and separating the declarative aspects of the code from the imperative ones. XML didn't really exist yet, so I wrote my own XAML-like syntax for rendering UIs with MFC.
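A publisher-subscriber pattern of the kind mentioned above can be sketched in a few lines of Python (the class and method names here are illustrative, not from any actual framework). The decoupling point is that publisher and subscriber share only a topic name, never a reference to each other:

```python
class EventBus:
    """Minimal publish-subscribe hub: subscribers register callbacks
    per topic; publishers need only the bus and a topic name."""
    def __init__(self):
        self._subscribers = {}  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # A topic with no subscribers is silently ignored.
        for callback in self._subscribers.get(topic, []):
            callback(payload)

# Usage: the subscriber knows nothing about who publishes "order.created".
bus = EventBus()
received = []
bus.subscribe("order.created", received.append)
bus.publish("order.created", {"id": 42})
print(received)  # [{'id': 42}]
```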
Other things became apparent too - the tight coupling of the data access layer (and the embedded SQL statements that were rampant at that time). Again, a custom scripting language to separate the SQL statements from the app had a direct impact on how quickly code could be changed to deal with changing / new requirements. Everyone's jaw would drop when I could run a C++ app, change some declarative markup and SQL, and have new functionality without compiling a line of code (heck, without even exiting the application).
About that time, the Design Patterns book hit the streets and everyone was yammering about DP's, and I thought, wow, this is so old hat!
And also about that time, I read Vital Dust[^], which changed how I thought about programming forever. It's quite fascinating that a book about the roots of life changed my thinking about programming - something no programming book had (and has) ever done. Around that time, I discovered Code Project (was using a 64K ISDN modem at the time) and so, on May 26 2002, I took my ideas from Vital Dust and the experiences I had gained and wrote my first article, Organic Programming Environment[^]. (Years later, the concept was revisited again with HOPE[^] originally posted exactly (-1 day) 12 years later!)
And then C# appeared. The rest was history. C# has so greatly influenced my programming and thinking style that I cringe whenever I have to use another language (which nowadays usually means Python and JavaScript). And it continues to do so, with functional aspects, LINQ, etc.
And certainly, my history with compiled, strongly typed languages has bent my thinking, such that scripting, duck-typed languages are, to be frank, a joke (and a bad one at that) in my opinion.
So I continue on the path of further refining the ideas of decoupling code and creating modular, data-driven rather than imperative workflow-driven applications, where concepts like dependency injection are, in my opinion, totally the wrong direction to go (I can't believe people still use DI) - much to the ire of the people who look at my code and the supporting framework, who don't get it and never will.
So over the last 30 years, my exposure to a certain path of languages and a certain path of problems has led me to a particularly lonely corner of the programming universe! But thank God for Code Project, where you can at least peek under the bed to see what monsters lurk in the bedroom of my programming mind.
Marc
|
|
|
|
|
Nice explanation
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
What an enjoyable, and educational, tour of your journey as a programmer!
I'm still very influenced by my experience with LISP and PostScript, both of which, as you know, are interpreted and allow easy transformation of text to code, and code to text (REPL). Few people today understand that PostScript was really a kind of binding of a LISP-inspired framework to a graphics engine significantly more powerful than either of the graphics engines/libraries (QuickDraw, GDI) of the two dominant computer platforms / OSes.
I really appreciated the ease with which PostScript allowed one to save or restore the complete 'graphic state' in effect at any given moment ... and to implement a kind of context-switching inheritance model by having a dictionary stack, where any name lookup started in the topmost dictionary on the stack and searched downward, evaluating the first 'hit.'
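That dictionary-stack lookup rule (search the topmost dictionary first, then down the stack, first hit wins) can be modeled in a few lines of Python — a sketch of the mechanism only, not of the actual PostScript interpreter:

```python
class DictStack:
    """Name lookup modeled on PostScript's dictionary stack:
    begin/end push and pop dictionaries; lookup scans from the
    topmost dictionary downward and returns the first hit."""
    def __init__(self, systemdict):
        self._stack = [dict(systemdict)]

    def begin(self, d):          # like PostScript's 'begin'
        self._stack.append(dict(d))

    def end(self):               # like PostScript's 'end'
        self._stack.pop()

    def lookup(self, name):
        for d in reversed(self._stack):   # topmost dictionary first
            if name in d:
                return d[name]
        raise KeyError(f"/{name} undefined")

ds = DictStack({"linewidth": 1})
ds.begin({"linewidth": 5})       # local context shadows the outer one
print(ds.lookup("linewidth"))    # 5 -- first hit, in the topmost dict
ds.end()
print(ds.lookup("linewidth"))    # 1 -- falls through to the outer dict
```

Pushing a dictionary gives you exactly the context-switching inheritance described above: whatever the local context doesn't define falls through to the dictionaries below it.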
I remember a meeting where John Warnock (when I was at Adobe) talked to all of us engineers in the app division (at that time PhotoShop, Illustrator, newly acquired AfterEffects), describing how productive he and other developers/researchers at PARC had been using SAIL, SmallTalk, and other in-house dev tools.
That was during the period of time when the team porting PhotoShop to Windows had decided the only way to do it was to implement the MacApp FrameWork itself, and then write/port the code to that. This was considered 'lunacy' by other engineers who predicted they'd end up having to re-write it from scratch ... and I am sure it has been re-written many times as it evolved into PS CS/CC.
For me, C Sharp is the ultimate, and I am a 'true believer' in strong-typing. I believe that one reason I like C-Sharp so much is, of course, its vibrant, massive, infrastructure of adopters, users, devs, resources like CP and StackOverflow, very large numbers of quality blogs from within MS as well as out, and, the educational efforts, and technical writing, of individuals I consider gurus-of-gurus, like Jon Skeet, Eric Lippert, Marc Gravell, Pete O'Hanlon, Marc Clifton, Sacha Barber, and many others.
Of course, it's always nice to play 'whatif:' what if MS had put its dev assets to work on an improved Windows Forms, with a retained-mode 2-d graphics engine made usable ... so that an app like Marc's FlowShape could be done relatively easily ... instead of doing the whole dance with WPF, Metro, etc.
Anyway, for me, it's great to be alive and able to use C-Sharp ... even if the latest MS OS is going through 'teething pains'!
«There is a spectrum, from "clearly desirable behaviour," to "possibly dodgy behavior that still makes some sense," to "clearly undesirable behavior." We try to make the latter into warnings or, better, errors. But stuff that is in the middle category you don’t want to restrict unless there is a clear way to work around it.» Eric Lippert, May 14, 2008
|
|
|
|
|
I knew a prof at the University of Waterloo who said Basic, a language he taught, "causes irreparable brain damage".
"Time flies like an arrow. Fruit flies like a banana."
|
|
|
|
|
Soviet scientists tried for decades to network their nation. What stalemated them is now fracturing the global internet. If only they had Al Gore...
|
|
|
|
|