I for one have no idea what you are talking about, but what the heck!
Didn't know your account wasn't OK ... did the special cream help?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
It always fascinates me how many people seem to think they are paying for (and can thus demand) a special service here.
"Don't tell me to read the documentation, just do the research for me!"
"Why won't anyone answer my questions?"
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
Humans are terrible at communication. We fill in too many blanks with our own experience, and we're often wrong.
Just ignore those who don't help, and make sure you're providing enough information that someone who can't see what you're doing can still help you.
If anyone can explain to me the logic behind Microsoft's boilerplate scaffolded Identity management code for ASP.NET apps conflating username and email so you are (a) confused as to what's happening, and then (b) hitting trip wires after you unconflate them, then I'd love to hear it.
If you're going to provide Best Practices boilerplate, then please make them, at the very least, Pretty Good Practices.
cheers
Chris Maunder
Is it like the problem described here[^]?
Quote: The UserPrincipalName and Primary email address are based on different attributes from local AD. The UserPrincipalName is generated based on the User logon name in ADUC, while the Primary email address is based on the proxyAddresses. In this situation, we suggest you check the user's proxyAddresses in ADSI Edit and update it with the correct domain, then Force directory synchronization.
It seems really confusing. I'm not sure why it can't be simpler. But I don't understand all the super important stuff about Identity.
Active directory integration is a great way to make a simple thing complicated.
Real programmers use butterflies
No. They simply use "Email" to set "Username", so everything becomes weird. When logging in you need to provide a username, but it's actually your email. If you've overridden things to allow an actual username, then when the user enters their username it won't work, since the code is expecting the email. Or worse, you've hacked the code to allow real usernames, and when you pass in the email to log in, it's actually wanting the username.
Head asplode.
I get what they're trying to do. But it honestly feels like someone put together the minimum viable code, capped off the bits that were too hard (or that they ran out of time for), wrapped it up, and shipped it. It's totally not usable out of the box in the Real World. It's classic demoware.
cheers
Chris Maunder
What baffled me the most is that all the text in that boilerplate code is hard-coded.
You're either English (and only English) or you're forced to scaffold each and every page and replace it all with your own text or resx files.
I haven't hard-coded text for the past decade or so.
Also, all those Identity classes are Something<TIdentity>, but the UserManager<IdentityUser<TIdentity>> always returns a string for ID (and isn't based on an interface so you can't implement your own!?)
Those seem like rookie mistakes to me.
cheers
Chris Maunder
When I encounter code that does Something Cool(TM) at least half the time it's not readable. The more unreadable it is the more I want to make that magic mine.
And yet, when my own code that does Something Cool(TM) gets unreadable I consider it a personal failing of mine.
It's not very fair of me, to me, and sometimes it leaves me thinking I'm a crap developer.
That's not very realistic either, because if that were true it would mean my clients are all idiots, as is everyone who has ever told me what a great developer I am.
Imposter syndrome is annoying. Unreadable code brings it out in me.
On one hand: "HOW DID YOU DO IT? I MUST KNOW!" when other people do cool messy magic.
On the other: Now that I did Something Cool(TM), look at the mess I made.
I still think the best code I ever wrote was simple.
Real programmers use butterflies
And I agree that simple code is far better than unreadable code. The only exception might be "unreadable" code that is much shorter than the simple version would be; comments can deal with that.
I have seen a lot of unreadable code. Most of the time if I want to use it I try to make it readable, at least for me. Out of all the times I have done that, there was only one instance where I found the results to be slower than the original. That was quite a while ago and compilers have come a long way since then.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
Rick York wrote: if I want to use it I try to make it readable, at least for me
"at least for me" being the operative bit of that. I know my code isn't as readable as it should be, given how many people tell me my articles lose them.
Real programmers use butterflies
In the past I have written code that seemed readable at the time, but when I came back to it a few months later it was unintelligible to me! It worked fine, but how I did it, I have no idea!
- I would love to change the world, but they won’t give me the source code.
Forogar wrote: In the past I have written code that seemed readable at the time but when I came back to it a few months later was unintelligible to me! It worked fine but how I did it, I have no idea!
My experience is more like:
In the past I have written code that seemed readable at the time, but when I came back to it a few months later it was unintelligible to me! It looks like it has never worked, but how I ever thought it was OK, I have no idea!
- I would love to change the world, but they won’t give me the source code.
<OldFartWarStory>
I have inherited a body of code from a guy who retired about 10 years ago and then died. He was a grade A a-hole to work with while he was here. After he left his code bounced from person to person, with none of them making any changes, until it arrived on my desk. I have spent a couple hundred hours over the last year just looking at it in the course of diagnosing issues. I am currently about to make my first change in it due to a hardware obsolescence problem.
This is all C, running on Windows as a command-line app. The code was deliberately written so that A-hole was the only person who could maintain it. Identifiers are of minimal length. There are global values with single lower-case character names. While there are dozens of .C source files, none of them have matching .H files. All external references are explicitly specified everywhere. Function prototypes are not used - anywhere. Magic numbers abound. Pointer arithmetic is used unnecessarily and unwisely. There's more, but I don't want to frighten you children.
He cultivated a mythos surrounding his code, claiming that it was difficult and performance-intensive, and that was why other people couldn't handle it. His management believed it.
I have spent the last two weeks figuring out how to address this obsolescence issue. When I'm done, I will have replaced <100 lines of code with around 40.
</OldFartWarStory>
Writing unreadable code is inexcusable and should be grounds for termination with utmost prejudice.
Software Zen: delete this;
I've found there are cases where code is bound to be ugly no matter what. Either the interface the code must integrate with is nasty (the unfortunately undocumented soc/spi_master.h on the ESP-IDF), or the algorithm itself is horrific (LALR table generation), and there's little choice. Comments only get you so far, because you'd have to explain the entire thing you're coding against (the LALR algorithm, or the undocumented interface) as well as your own code.
So for certain things, I forgive ugly code, as long as it is otherwise well written.
Real programmers use butterflies
I've been trying with no luck to read data off of a couple of different SPI devices.
It turns out, the documented API for doing SPI with the ESP-IDF is not very capable.
There is a lower level API that both the higher level API and the Arduino framework layer are built on top of. It is undocumented.
Worse, it's Apache licensed, meaning just for me to figure out how it works, I get roped into their licensing scheme because I have to look at their code and make something based on it.
It does me no good if you build layer upon layer of API and then hamstring the top layer and don't document the lower layers.
And now I get to go cave diving. Here there be dragons.
Real programmers use butterflies
🐉🐲🐉
Does the recent case with Oracle and Google(?) exempt you from this licensing scheme? I thought it was decided that an API couldn't be copyrighted, just its implementation. Regardless, it is bizarre for an API to infect users with an undesirable license.
That's assuming I understand the license properly, and it's because I'm essentially forced to create a derivative work in lieu of documentation. I'm not sure how that court case impacts this.
Real programmers use butterflies
You can always bit bang the SPI.
I'm basically resorting to something like that, but for production I need to use the hardware SPI controller w/ DMA for performance reasons. I think I can get there with my current route, even though it's not ideal.
Real programmers use butterflies