The Lounge is rated PG. If you're about to post something you wouldn't want your
kid sister to read then don't post it. No flame wars, no abusive conduct, no programming
questions and please don't post ads.
I am still happily using my BlackBerry. It and its predecessors have never failed me, and the touch screen it doesn't have can be cleaned without making long-distance phone calls or deleting all my notes and contacts - you really have to watch what you swipe on an iCrap! I also have a couple of hundred books loaded on it that I am working my way through - very easy to read on the high-enough-definition screen. I also use a wonderful (free) app called Google Maps that, unlike that fruity version, actually seems to know where it is and how to get elsewhere accurately.
I use Tether or a USB cable to connect to the internet via my PC (Bluetooth is definitely switched OFF), or I simply use 4G through AT&T - which is VERY fast!
My plan gives me unlimited everything (grandfathered in) for one fixed fee.
I love my BlackBerry and drink (imported) Ribena (at $12 per litre - concentrated, though) to celebrate it every day!
- Life in the fast lane is only fun if you live in a country with no speed limits.
- Of all the things I have lost, it is my mind that I miss the most.
- I vaguely remember having a good memory...
When I delete a file from within an application (say, Visual Studio) rather than from Explorer, wouldn't it make more sense for that file to be sent to the recycle bin (rather than be deleted forever)?
I'd say this should be the default behavior, but there are instances where that wouldn't be desired (e.g., an application that continuously creates temp files then deletes them when done with some processing... that'd fill up the recycle bin pretty fast). Maybe a second recycle bin for application-initiated deletes would work (one with its own capacity and "overflow" policy)?
I know we can explicitly code for files to be sent to the recycle bin, but it's almost never done, so I'm thinking doing this by default would be the ideal solution. Perhaps they'll add this to Windows 9 or .Net Framework 5.
Perhaps they'll add this to Windows 9 or .Net Framework 5.
And if they added this to .Net Framework 5, someone will post:
I know we can explicitly code for files to be deleted permanently, but it's almost never done, so I'm thinking doing this by default would be the ideal solution. Perhaps they'll add this to Windows 10 or .Net Framework 5.5.
So, I think the best approach is to require the programmer to pass a moveToRecycleBin boolean, or something like that, so that every call site has to make the choice explicitly.
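The explicit-flag idea above can be sketched in a few lines. This is a minimal illustration, not a real OS API: `delete_file`, `move_to_recycle_bin`, and the `.recycle_bin` directory are all hypothetical names standing in for whatever the framework would actually provide, and a local folder stands in for the real recycle bin.

```python
import shutil
from pathlib import Path

# Hypothetical stand-in for the OS recycle bin.
TRASH_DIR = Path("./.recycle_bin")

def delete_file(path, move_to_recycle_bin):
    """Delete a file; the caller must state whether it is recoverable.

    There is no default value for move_to_recycle_bin, so no caller can
    delete a file without consciously choosing between the two behaviors
    debated in this thread.
    """
    path = Path(path)
    if move_to_recycle_bin:
        TRASH_DIR.mkdir(exist_ok=True)
        # Avoid clobbering an earlier deletion that had the same name.
        target = TRASH_DIR / path.name
        n = 1
        while target.exists():
            target = TRASH_DIR / f"{path.stem} ({n}){path.suffix}"
            n += 1
        shutil.move(str(path), str(target))
        return target  # where the file can be recovered from
    path.unlink()      # permanent delete
    return None
```

An application churning through temp files would pass `False`; an IDE deleting a user's source file would pass `True`.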
Whilst hanging innumerable Christmas lights at my lady's house this weekend (and last), it occurred to me that there is nothing stopping us from creating wire strings that contain individually addressable LEDs, with a wireless controller at the power plug. IPv6 makes a virtually unlimited number of addresses available, so it's quite feasible from an addressing standpoint. Think of the fun you could have writing programs to animate the yard display, without having to deal with the bunch of electromechanical relays and such that are traditionally employed by the lunatics who make an annoying, animated display of their yards every year. Not only would it save electricity, but it would also open up a whole new field of application programming for you all!
So, why isn't it already being done? What's holding us back?
Cool! But still a bit pricey for everyday use... It's also rather clunky, what with a full circuit board and IC controller for each LED. I very much like the fact that the color is programmable, though. What I had in mind was a bit more mundane - a bundle of AWG28 or smaller wires, each terminating at an LED and connected to a common ground wire, with each driven by an addressable demultiplexer chip in the power block. Each strand would have its own dedicated network IP address and wireless interface, and the individual LEDs would respond to a subaddress within the strand.
Very trivially - by running conductors from a control assembly to the LEDs. It wouldn't take a large wire, and a bundle of very tiny wires would handle a large number of LEDs per string. Address decoding would be handled in a controller located at the ac mains connection. The demultiplexing could be 1-of-n or possibly m-of-n for more simultaneous activations, all of which can be done very easily.
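The 1-of-n addressing described above amounts to computing a select-line bit pattern for each LED subaddress. Here is a small sketch of that decode step; `demux_select_lines` is a hypothetical name, and the assumption is a single 1-of-n demultiplexer in the controller at the mains connection.

```python
def demux_select_lines(led_index, num_leds):
    """Return the select-line bits that route power to one LED on a
    1-of-n demultiplexer.

    A strand of num_leds LEDs needs ceil(log2(num_leds)) select lines;
    the controller sets these bits and the demux energizes exactly one
    output conductor in the bundle.
    """
    if not 0 <= led_index < num_leds:
        raise ValueError("LED index out of range")
    # Number of select lines needed to distinguish num_leds outputs.
    width = max(1, (num_leds - 1).bit_length())
    # Least-significant select line first.
    return [(led_index >> bit) & 1 for bit in range(width)]
```

For the 50-LED strand mentioned below, six select lines suffice; an m-of-n scheme for simultaneous activations would return one pattern per active LED instead.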
At this point it sounds like you've changed your design from one microcontroller per bulb to putting each bulb on a separate power loop. Replacing a two- or three-strand cable with an N-strand cable (N = bulb count) would result in an unwieldy cable and would just shift where the excess cost is.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
I did find a lot of information on doing that with Arduino. There are some lights that have the 1-wire-per-LED feature, and just connecting one or more of those to an Arduino board would work quite nicely.
Sorta Off Topic, but my favorite website for electronic stuff has to be All Electronics[^]
Yup, it's a cool site! I think something along the lines of an Arduino would work nicely, using it to address a demultiplexer, or even a crosspoint switch configuration to pulse individual LEDs. I don't know what currents modern LEDs require, but even at the 20 mA level the originals consumed, each LED would be adequately served by a #28 to #32 conductor, and 50 of these would bundle into a very convenient cable size for stringing in trees.
Roger, I looked into this a while back as a project for my Arduino. It's basically cost that stopped me. I wanted to create programmable/controllable LED lightboxes (1,000+ LEDs) for video and stage lighting, in which each LED could be color programmed.
My own lack of capital prevented me - maybe Kickstarter?
That would be ambitious, certainly. But costs are going down, and it's only a matter of time before your idea will be affordable, I think. Some of the color-programmable solutions mentioned in this thread take the approach of allowing 8-bit color values to each LED, with each having its own controller chip. Another approach is to use a group of 3 LEDs - R-G-B - and apply varying pulse widths to each in order to create different colors. Persistence in human optical perception makes a number of shortcuts possible.
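The pulse-width approach described above maps each 8-bit channel value to an on-time within a fixed PWM period. A minimal sketch of that mapping (`rgb_to_duty_cycles` and the 1 ms period are illustrative assumptions, not any particular controller's API):

```python
def rgb_to_duty_cycles(r, g, b, period_us=1000):
    """Map 8-bit R, G, B values to PWM on-times for a 3-LED cluster.

    Perceived color comes from persistence of vision: each LED in the
    R-G-B group is pulsed on for a fraction of the period proportional
    to its channel value, and the eye blends the result.
    """
    def on_time(channel):
        if not 0 <= channel <= 255:
            raise ValueError("channel must be 0-255")
        # Full-scale 255 maps to the entire period (100% duty cycle).
        return channel * period_us // 255
    return on_time(r), on_time(g), on_time(b)
```

So a half-intensity blue with full red gives roughly a 100% red pulse and a 50% blue pulse per period, with green off.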
Start out small - maybe a couple hundred cells - and build your idea from there. My first digital alarm clock was built from TTL flip-flops and discrete LEDs because I couldn't afford fancy clock ICs that cost an arm and a leg. In the end, mine turned out to be much cooler, because the display was straight binary, not the mundane, boring decimal digits everyone has now. Heck, I think mine was even more effective, simply because it took mental effort just to decode what time it was when the thing fired off in the morning.