|
Ah, I see your point. So if I get it right, you're doing a crap heap of grunt work, GUI boilerplate not being any more enjoyable than any other sort of boilerplate.
That's why I didn't get your point at first: my product is a Windows executable, and Windows is pretty good at taking grunt boilerplate off my shoulders.
|
|
|
|
|
I've done some UI work on an Arduino (note the lack of the G, but it's still a UI). Rotary encoders, buttons, joysticks: it's all still UI. I totally get the "how can I automate this?"
|
|
|
|
|
Yes, I enjoy it, but then I work in plain vanilla PHP, building things from scratch, so there is plenty of creativity involved.
Most often, working with legacy code, I just copy the UI of other pages in the codebase, but occasionally I do get to build something new, and I enjoy figuring out the CSS to make it work (there is a rule around here, soon to change, thank God, that we only use JavaScript when we also have a noscript option). Even though I know I'm often reinventing the wheel, I find it a fun little diversion. I also enjoyed deriving physics formulas from scratch in my college classes, so that's a clue.
|
|
|
|
|
I liked the days of the old WinForms. I don't mind HTML, but I super dislike XAML and scaffolds like React. I can work in them, but they are not fun to me, and I find myself delaying building with them. Note that I have not tried the new WinForms yet.
|
|
|
|
|
Been a full-stack developer on a WinForms app for 15+ years. I actually enjoy the UI part as much as the rest (code, DB design/development). I am very particular about my UI work and have established a number of standards that I consistently follow. I make sure the other devs (who are much, much newer to the project than me) follow the same standards. They are not overly happy about that...
|
|
|
|
|
I don't like bizdev and I try to avoid it. I don't look down on it, as we need folks to do it, and it is legitimate dev work, but I just do not like it, Sam, I am.
I'm happy for you that you do.
These days I mostly do hardware and software for little smart gadgets. It reminds me of coding back in the 80s when I learned, when every byte counted. Getting TrueType fonts rendering on a system with less than 300 KB of RAM felt like a huge accomplishment.
Real programmers use butterflies
|
|
|
|
|
I do.
Users usually deliver their UI design, and we have to derive the spec from it. UI-driven design is as old as the programmer-only degree.
It is crucial that they don't leave it to IT to figure out.
|
|
|
|
|
Actually I love to make those UI screens.
I try to make things accessible and simple for users. It's very hard to make a simple and useful interface.
|
|
|
|
|
I feel you! I LOVE creating the tools and I LOVE the satisfaction of customers using my tools, but actually using my tools myself is BORING. Check out my toy...
BuilderHMI[^]
|
|
|
|
|
It's rare for me to post here in the lounge but I'm curious about this.
So I'm buying a new rig after 10+ years on my current one: a new Dell Precision 5820 with twice as many CPU cores, so I can build bigger things faster. And I'm trying to decide whether to spend more money on an RTX A4000 over the RTX 4000. The RTX A4000 is roughly twice the card in every spec for $300 USD more.
I'm wondering if anybody here is doing any GPU programming, or writing code to use the GPU along with the CPU, taking advantage of CUDA cores and parallel processing. Currently the RTX A4000 would be overkill for me, but if I can use it to do something cool with all those cores on the card, then I would pull the trigger and get the better one.
I hear all this talk about Deep Learning and AI, plus Bitcoin mining, and wonder if that's something within reach of dabbling with, if I have a cool video card. Sometimes I wonder if I should be mining Bitcoin while I write code during the day. Being able to get just one coin would be cool at today's rate.
RTX A4000 Graphics Card | NVIDIA
Quadro RTX 4000 Graphics Card | NVIDIA Quadro
If it ain't broke don't fix it
Discover my world at jkirkerx.com
|
|
|
|
|
No idea, but my kid keeps complaining that the high-end cards just aren't available.
So look into what is actually available.
|
|
|
|
|
Oh, I can get one, no problem; I've already confirmed that. If I order it today, it would be a Jan 20, 2022 delivery for the whole system. It's getting the card alone that is hard, but Dell sets aside cards for complete system builds as a priority. And the price wasn't that jacked up either.
If it ain't broke don't fix it
Discover my world at jkirkerx.com
|
|
|
|
|
After 5 weeks, everything becomes a habit. I expect that at the end of that time, you'll want more rigs ... for mining. Like having a chinchilla farm.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
I used to write a lot of GPU shaders for graphics, but if there is anything with an even greater hunger for GPU processing power, it is AI. The shaders for calculating a neuron may be relatively simple, but there is no limit to how many neurons you may want to use. That said, for both graphics and AI, the question is always whether you are working at a level that justifies the price; otherwise it's more economical to use the best of the last generation and upgrade later, when the current best of the best has also taken a back seat to something new.
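To make the "simple per neuron, unlimited in number" point concrete, here is a minimal NumPy sketch (all names and sizes are illustrative): each neuron is just a dot product plus an activation, but a layer of them over a batch becomes one big matrix multiply, which is exactly the workload GPUs parallelize well.

```python
import numpy as np

def layer_forward(x, w, b):
    """One dense layer: x is (batch, in), w is (in, out), b is (out,).
    Each output column is one neuron's dot product, then a ReLU."""
    return np.maximum(0.0, x @ w + b)

rng = np.random.default_rng(0)
x = rng.standard_normal((32, 128))   # batch of 32 inputs
w = rng.standard_normal((128, 64))   # 64 neurons' weights
b = np.zeros(64)
out = layer_forward(x, w, b)
print(out.shape)  # (32, 64): one value per neuron per input
```

On a GPU the same computation maps each neuron (or tile of the matmul) to its own thread group, which is why the per-neuron "shader" stays trivial while the total work scales without limit.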
Bitcoin mining is no argument either. The days when bitcoins were easily found are over. Finding more always takes longer than the previous solutions did. You can throw more processing power at the problem, but you will suffer the law of diminishing returns. The only people who tell you otherwise are those who intend to pull as much money as possible out of your pockets before mining no longer even justifies the energy cost of a computer running 24/7. So if you are mining just for fun, don't invest much in it, and just take whatever you get with whatever you have.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
I do a LOT of programming for the GPU. I've been working on a project for about two years now where I adapted a large amount of code to work on either the GPU or the CPU. With the speed and core count of AMD's processors these days, we have found that the CPU version is just as fast as the GPU version. It's been a lot of fun and I've learned a ton of stuff. I also dabble with GPU stuff for my side projects, which tend to be graphics-oriented, and it's great for that too. My avatar image is an example.
One caveat about my statement on performance: comparisons were done between my home system and systems with Threadrippers (32 cores) and EPYC CPUs. My GPU is a 3090 and it's pretty fast. We have more systems on order with EPYCs and A100s, so I will have more to compare with someday. We were supposed to get one of those in September, but they are no longer even giving us a delivery date because EPYC CPUs are in very short supply.
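One common way to get the "same code on GPU or CPU" effect the poster describes (a sketch of the general pattern, not their actual project) is to write the array math against a module parameter: pass NumPy for the CPU path, and with CuPy installed you could pass `cupy` instead, since it mirrors the NumPy API.

```python
import numpy as np

def saxpy(xp, a, x, y):
    """a*x + y, written against the array module xp so the same
    expression runs on NumPy (CPU) or a NumPy-compatible GPU library.
    All names here are illustrative."""
    return a * x + y

# CPU path via NumPy; a GPU path would pass cupy and cupy arrays instead.
x = np.arange(5, dtype=np.float32)
y = np.ones(5, dtype=np.float32)
print(saxpy(np, 2.0, x, y))  # [1. 3. 5. 7. 9.]
```

The appeal of this dual-backend style is exactly what the poster found: once the code paths are unified, you can benchmark CPU vs. GPU honestly and pick whichever the hardware favors.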
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
I specced out an Intel W-2255 with 10 cores, up from the 6-core Intel Xeon with an NVIDIA K4000 that I have now.
So you're sort of saying that I can dabble with an RTX A4000 and a 10-core CPU, do some neat things on a much lower level of hardware than yours, and still have the capabilities you're working with? That 3090 is pretty stout, roughly 2x mine, and I just looked up the Threadripper (32 cores) and WOW! That's like $4200 just for the CPU and card, while I'm looking at $3433 total and scratching my head over spending another $308 for the card.
That's feedback I was looking for.
Thanks!
If it ain't broke don't fix it
Discover my world at jkirkerx.com
|
|
|
|
|
Just to clarify, the Threadripper and EPYC are CPUs on work machines. We have a couple of Titan RTXs for them but the 3090 is faster with twice the memory and twice the cores. My home machine has a Ryzen 5900 in it with twelve cores. I essentially bought a video card and they threw a computer in with it. Obviously, I am a big AMD fan. That is, of their CPUs. Their GPUs are OK but I use CUDA and they are not compatible with it, unfortunately.
Best of luck with your efforts.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
Nice card. Too bad the styling of it is so boring.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
|
I figure it's like building a car online: if you want that option, it costs more up front, so I don't have to buy it again later.
But this card uses less electricity than the RTX 4000 does. I read it runs at 130 watts with ease instead of the full 140 watts; not sure if that's really a game changer or not. But I can mine crypto with it, and that would be interesting. Plus it's newer and more powerful, and I could use the extra display ports if I get a fourth monitor.
Perhaps the real reason is that I just paid the Uline bill for cardboard boxes and poly bags, which was $300, and started thinking that for $300 I got a stack of new cardboard boxes and 2K poly bags in return. What the heck, buy the card. I'll see this rig Jan 20, 2022, so it will be a long wait.
If it ain't broke don't fix it
Discover my world at jkirkerx.com
|
|
|
|
|
Is the 5820 a mobile workstation?
Mine is a 5560 with an RTX A2000. I decided on the A model mostly because of the lower power consumption and more processing power.
Dell has greatly improved the cooling system and this thing is pretty quiet; on my previous machine, an M4800, I could hear the fans spin up whenever Microsoft pushed updates.
Word of warning: on the 5560 the Bluetooth is on the main board. On this machine there was a 2-4 second delay (keyboard and mouse) after a few seconds of inactivity. That was power-saving functionality in the new BT version.
It made using the mouse and keyboard a RPITA.
Dell twice attempted to fix this with new motherboards and failed both times. Then, just when I was getting ready to return it, a new driver became available that fixed it.
|
|
|
|
|
The 5820 is the Dell Precision Tower. Precision 5820 High Performance Tower Desktop Workstation | Dell USA
I have the older version of the Dell Precision 5560 Xeon notebook with the touchscreen that I use for dabbling and for client presentations. I think it's a 5520, but it has an older Quadro M something. Dell Precision 5560 AI-optimized Mobile Workstation Laptop | Dell USA
After I bought the notebook in Nov 2019, I really wanted the workstation for the office, but my friends kept talking me out of it. I had a problem with flash drives burning up, and with flash drives corrupting data when copying to the NVMe drive on the notebook. I really like the notebook, and it's pretty slick.
But now it's time to replace my office workstation. Actually, my current Precision T3600 (2011) works so well that I forgot it was on the list, and I was going to buy a 100 oz bar of silver or a 1 oz American Gold Eagle this week. I'm still sitting on my stimulus money and haven't spent any of it yet, but I'll probably just pay for it with my business money.
My friends think I should buy a gaming-type computer with a 10-core Core i9 and go really fast, and they don't understand why I bought such a boring corporate type of computer. But the Dell Precision trim just works for me and has proven to be a workhorse.
In case you're wondering, I work three jobs. I diversified my business and started selling on several marketplaces in two different countries, and I will add two more countries by March 2022 and sort of go worldwide in four different languages. I still write code to manage my business, and I write code for other companies, which has really picked up over the last two years. Sales really picked up this week, with people Christmas shopping early. Yes, I have my supply chain issues, and it's getting really bad now, so I had to 4x my inventory to keep things in stock. Turns out I'm really good at selling online; well, I should be, since I've been writing eCommerce software for 17 years now.
If it ain't broke don't fix it
Discover my world at jkirkerx.com
|
|
|
|
|
Make sure it doesn't come with onboard bluetooth.
|
|
|
|
|
|
I'm anxiously awaiting the final novel in the series. It was originally supposed to be released in October, and now it's scheduled for early December. I've had a pre-order in for it since July.
Someone posted that this is the final season, and that they're skipping an entire story arc. I can't watch the trailer at work (ze IT gestapo).
Software Zen: delete this;
|
|
|
|