|
I have both Visual Studio 2019 and 2022 installed on my development machine.
For my current development efforts, which centre on military simulations, I find VS 2019 with .NET Framework 4.6 works just fine.
With the little I have done with Visual Studio 2022 I have yet to see any problems. But, then again I have not used it extensively.
Since you appear to be working in the same technical areas that I am, in terms of the needed technologies, I don't see any reason for you to upgrade to Visual Studio 2022 unless you want the latest features and want to work with .NET 6.x and above.
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
|
|
|
|
We use VS2019 and VS2022 on VMs running on a system that is firewalled off from just about everything. For NuGet packages we download them and transfer them to a server that is inside the firewall. The NuGet packages are in a folder that is shared to the VMs. It's irritating to update VS and the NuGet packages, but it works.
|
|
|
|
|
I've not used VS2022 offline so I can't comment on whether it works fine or not.
If you want to use .NET 6+ then you need to use VS2022. It is an improvement over VS2019, especially with Git.
As for NuGet, you don't need a dedicated "NuGet application server" as such; you can just use a regular folder as a NuGet package source, either on the same machine or on a shared network drive. VS2019/2022 is happy to find and install packages from a plain Windows folder. You just won't get some of the extra metadata that a NuGet server adds.
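As a concrete sketch of the folder-as-source approach: a NuGet.config placed next to the solution can register a plain folder of .nupkg files as a package source. The share path below is a placeholder, not from the thread; substitute your own folder or network share.

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- A plain folder of .nupkg files; no NuGet server required.
         \\fileserver\NuGetPackages is an illustrative placeholder path. -->
    <add key="OfflinePackages" value="\\fileserver\NuGetPackages" />
  </packageSources>
</configuration>
```

The same source can also be registered from the command line (`nuget sources add` or `dotnet nuget add source`) if you would rather not edit the config file by hand.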
|
|
|
|
|
I have tried this with every version of Visual Studio, on a completely offline home system used for learning purposes. After VS2015 they all failed here and there. If your requirement is just .NET Framework 4.7, you can stick with what is working for you.
|
|
|
|
|
Thanks to all for your experiences and opinions.
The results are:
1) VS 2022 off-line -- I'll have to experiment with VS 2022 to see how it behaves off-line for myself. The comments here have been a mixed bag, seemingly depending on which mode the individual is used to: habitually online folks find offline use a pain, whereas those who work mostly offline without online packages find it better than 2019.
2) .NET vs .NET Framework -- very much based on project domain. We prefer to stay close to current versions but ... I'll just have to see.
3) NuGet off-line -- this is pretty easy, either using GitSomething as the local server or just stashing the downloaded packages in a directory and pointing VS at that. Making sure I get all the packages I want the first time is going to be the key here.
Thanks again!
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
|
|
|
|
Going for my Covid jab this morning so might not respond quickly.
A quiet meadow for definite enjoyment (8)
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
P
LEA
SURE
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
I just got back from having mine - Pfizer this time, for a change.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
I've had Pfizer every time
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
All vaccined up
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
I'm not sure how realistic this would be, given the fuzzy nature of AI, but optimization is sometimes a fuzzy game as well.
I don't really know a lot about what we call AI, but it seems to me if you could use it for code *synthesis* you could also use it to optimize program code output during compilation.
AI should, in essence, be able to see higher-level patterns in code: maybe, for example, automatically de-inlining repeated code at a higher level than would otherwise be possible, performing some degree of post-hoc factoring.
It only just occurred to me today, so I haven't kicked it around a lot, but considering what Google is doing with "fuzzing" and such, I don't think this is totally unreasonable to consider.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Having seen some of the code that ChatGPT created and that gets posted in QA, I wouldn't trust an app optimised by it for anything important.
And "important" here means "coming within 100 feet of me or my computer"
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
What it spits out is really appalling sometimes; other times it's not too bad, but I think you are better off googling.
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
And the problem is that the people using it don't know the difference - they just hammer it into the app and walk away ...
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Ever since the advent of the Web it's become common (and largely acceptable) practice to cut'n'paste, which is fine if you understand the code. I used to carry several very large, heavy books around for years until I knew and understood them inside out, then buy more and repeat the process.
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
Meh, it's fine if what it is doing is immediately verifiable.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Maybe we can use AI to decipher some of your more technical messages on here
|
|
|
|
|
I see people here ranting about ChatGPT. I am sure your post did not refer to such a generalized tool!
I would assume that this will happen quite soon. If you had access to search academic papers, I think you would see some [at least] theoretical advances in the field.
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
Given that what passes for AI is basically advanced pattern matching, this may be possible. It is my understanding that optimizers already do this; they just don't call it AI.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
They do, but only go so far.
A C++ compiler, for example, won't "de-inline" repeated code for you.
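To make that concrete, here is a hand-written sketch (names and data are illustrative, not from any real compiler pass). Linkers can merge *identical whole functions* (ICF), and some toolchains outline repeated instruction sequences for size, but a shared source-level fragment buried inside two larger functions is typically not factored out; the hypothetical higher-level pass would have to spot it and extract something like `positives` below.

```cpp
#include <cassert>
#include <vector>

// Two routines that repeat the same "keep the positives" filter inline.
double sum_of_positives(const std::vector<double>& v) {
    double s = 0.0;
    for (double x : v)
        if (x > 0.0)          // duplicated filter, copy #1
            s += x;
    return s;
}

int count_of_positives(const std::vector<double>& v) {
    int n = 0;
    for (double x : v)
        if (x > 0.0)          // duplicated filter, copy #2
            ++n;
    return n;
}

// The helper a hypothetical post-factoring pass would extract; today a
// human has to notice the repetition and write this by hand.
std::vector<double> positives(const std::vector<double>& v) {
    std::vector<double> out;
    for (double x : v)
        if (x > 0.0)
            out.push_back(x);
    return out;
}
```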
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
honey the codewitch wrote: you could also use it to optimize program code output during compilation.
Optimizing code seldom provides any measurable impact. At least not in my domain space. But it might do so more in yours.
In my domain spaces optimization is impacted by the following
1. Requirements (most)
2. Architecture
3. Design (explicit or implicit)
4. Technology/code (least)
Keeping in mind of course that one must be able to differentiate between those parts.
One can, and I have, achieve orders-of-magnitude improvements by requesting changes to 1. But for 4 (per this suggestion), actual work can seldom achieve anything more than about a 1% difference, especially in terms of the user experience.
A profiling tool might be able to find a bottleneck but then one must analyze where the problem lies in the parts above.
|
|
|
|
|
Yeah, I'd have to see data before I wrote it off.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
You are certainly right that the first three points are essential. I take for granted that '3. Design' includes choice of algorithms and data structures (at multiple levels). (Side remark: I think that the importance of good data structures is highly undervalued in many of today's designs!) Doing these points right might lead to big-O improvements.
Yet, I think that you go too far stating that 'Optimizing code seldom provides any measurable impact'; I suspect that what you think of as 'unoptimized' code is what your compiler produces by default. It must be decades since I saw a production compiler that did no optimizing at all! Any common compiler today employs a big crowd of optimizing techniques. Using the result of this as the base for comparison, you might well see that you 'seldom can achieve anything more than about 1% difference', because the potential has already been taken out.
When I update my computer to new technology, running exactly the same binary applications (no changes in steps 1-3, so the only change is in technology, not even the coding), I experience a lot more than a 1% speedup. 'Especially in terms of the user experience', as you phrase it.
(My current main PC is of 2016 vintage, so I honestly expect a speedup the day I upgrade, otherwise I would say that technological development has failed miserably ... except for sound and video. I do not expect those to play any faster.)
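The data-structure point is easy to demonstrate. The sketch below (illustrative names and data, not from the thread) answers the same membership question two ways; only the choice of container changes, yet the complexity drops from O(n·m) to O(n + m).

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <unordered_set>
#include <vector>

// Linear scan of a vector: O(n) per lookup, O(n * m) overall.
int count_known_linear(const std::vector<std::string>& known,
                       const std::vector<std::string>& queries) {
    int hits = 0;
    for (const auto& q : queries)
        if (std::find(known.begin(), known.end(), q) != known.end())
            ++hits;
    return hits;
}

// Hash set: O(1) average per lookup, O(n + m) overall.
// Same answer; the big-O improvement comes purely from the
// data-structure choice, before any code-level tuning.
int count_known_hashed(const std::vector<std::string>& known,
                       const std::vector<std::string>& queries) {
    std::unordered_set<std::string> set(known.begin(), known.end());
    int hits = 0;
    for (const auto& q : queries)
        if (set.count(q))
            ++hits;
    return hits;
}
```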
|
|
|
|
|
trønderen wrote: Yet, I think that you go to far stating that 'Optimizing code seldom provides any measurable impact',
I was referring to what is normally known as 'micro optimization'.
It happens when a developer comes across a bit of code which they 'know' can be written to execute more efficiently, and then they rewrite it, without ever attempting an end-to-end test of the application with real business messages to see what impact that specific bit of code has on the enterprise.
trønderen wrote: I experience a lot more than 1% speedup. 'Especially in terms of the user experience', as you phrase it.
I don't.
Coding
1. Most of my time is spent designing and writing code. Hardware can't speed that up.
2. Compiling. I have certainly never seen a compile go from 1 hour to 1 minute, regardless of hardware. If it goes from 60 minutes to 50 minutes (far more than 1%), I would not even notice.
3. Debugging. Hardware can't help with that.
For some other examples: at my current company, the primary system runs on the biggest cloud box available (multiples of them); we can't go bigger. The performance problem is due to a legacy system upgraded piecemeal (not even coherently), with no limits on how the users are allowed to use the system. Up until about 6 months ago even the public REST API throttle did not work, and even now the company motto is just to increase the limit, at no charge, if a customer has a problem.
Hardware will not fix any of that.
Another example is from a different company. Performance testing with business load demonstrated that the application was as optimized as possible. That was even proven over time with production. The proven (measured) bottleneck was the third party services that the application had to interact with. No way to even fix that problem.
|
|
|
|