Wordle 803 4/6*
⬛⬛🟨⬛⬛
⬛🟩🟩⬛🟩
🟩🟩🟩⬛🟩
🟩🟩🟩🟩🟩
Wordle 803 5/6
⬛🟨🟩🟨⬛
🟨🟨🟩⬛⬛
⬛⬛🟩🟩🟩
⬛🟩🟩🟩🟩
🟩🟩🟩🟩🟩
Wordle 803 4/6
⬛⬛⬛⬛🟩
⬛🟩🟩⬛🟩
🟨🟩🟩⬛🟩
🟩🟩🟩🟩🟩
Looks like you and I were kinda trying the same words.
Noteworthy in stale ingredients? (7)
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
holy man around foreigner
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
modified 30-Aug-23 4:24am.
SALIENT = Noteworthy
Anagram of IN STALE ("ingredients" is the anagram indicator)
or
Holy man = ST (saint)
Around foreigner = ALIEN, giving S(ALIEN)T. But I got it from the original clue...
YAUThursday!
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
I'm writing some examples for using my UIX library.
It's unexpectedly challenging. I've created real world apps using it in less time than this.
The difficult bit is making it not feel too contrived while at the same time making the code simple enough to understand.
It's not writing simple code that's difficult, it's coming up with a simple application, that's nevertheless complicated enough to illustrate core concepts.
It's so much easier when I have a direct purpose to my application, rather than dealing in the meta and writing it for its own sake. It makes me feel rudderless. Like, what is it even supposed to do other than exist?
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
I am reminded of the early 90s, when at least half of the apps available for Windows 3.x were screen savers.
I had a similar experience, so I ended up using mostly real-world cases (names changed to protect the innocent), but they were not optimal. It's hard, I agree.
"A little time, a little trouble, your better day"
Badfinger
modified 30-Aug-23 0:53am.
honey the codewitch wrote: coming up with a simple application
Maybe you should ask your husband for this. Unless he was also involved in the development activity.
The more distant he is from the code development, the simpler will be the applications suggested by him.
modified 29-Aug-23 23:49pm.
What features would you like to show in the example application?
Basically I just want to run the entire API through its paces: showing each control, maybe how to build a custom control, handling multiple screens, etc.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
Keep in mind that most API documentation never presents anything anywhere close to a working "application"; they're just snippets that illustrate how to provide input parameters and look at the output (if they bother). And no error-checking.
I'm not saying this is what you should strive for...I'm saying your samples shouldn't necessarily include much code that doesn't deal with invoking your APIs, and leave it at that.
Yeah. That's actually the challenging part, and why I feel directionless about it. Coding for its own sake, kind of. Going through some motions without a clear set of application requirements, other than the code must be simple and demonstrate the featureset.
Like, I put a push_button on the screen, but clicking it does what?
It's all kinds of little niggling details like that. It's death by a thousand cuts.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
Push the button and it disappears. Then two more buttons appear...
- I would love to change the world, but they won't give me the source code.
Then two more buttons appear...
...In different locations...
Sincerely,
-Mark
mamiller@mhemail.org
Do you have tests? You could use those as a base for the API snippet kind of thing. Or I have no idea what I'm talking about.
I've given up trying to be calm. However, I am open to feeling slightly less agitated.
I'm begging you for the benefit of everyone, don't be STUPID.
I don't yet. I should do TDD, I know, but my GFX lib is nearly impossible to unit test due to the surface area, so when I built UIX on top of it, I just ad hoc tested it the way I did GFX. I *do* have *some* unit tests for GFX now, but not a lot of code coverage. On the bright side, they're run before the lib is published. For UIX I don't even have CI/CD going yet. That's next; then I'll write some unit tests using the PIO and Unity test framework.
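For what it's worth, the easiest wins tend to be pure functions that run on the host without any hardware. This is not gfx's actual code, just a hypothetical sketch of the kind of helper (here, packing a color into the RGB565 format most small LCDs use) that cheap host-side unit tests can cover:

```cpp
#include <cstdint>

// Hypothetical example of a host-testable graphics helper: packing an
// 8-bit-per-channel color into 16-bit RGB565. Not taken from gfx itself;
// it just illustrates logic a unit test can exercise with no device attached.
constexpr uint16_t to_rgb565(uint8_t r, uint8_t g, uint8_t b) {
    // 5 bits red, 6 bits green, 5 bits blue
    return static_cast<uint16_t>(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```

A Unity test would then just assert the expected packed values for a handful of known colors (black, white, and each primary).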
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
honey the codewitch wrote: Like, I put a push_button on the screen, but clicking it does what?
It takes you to the next control example screen. If you have a text input, it won't take you to the next screen unless the user enters whatever text you select. Each screen's completion just takes you to the next control. That's it.
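That "each screen's completion advances to the next demo" idea is basically a tiny state machine. A sketch of the pattern in plain C++ (all names here are made up for illustration; this is not the uix API):

```cpp
#include <cstddef>

// Hypothetical demo-tour state: an ordered list of control demo screens,
// where each screen's completion callback advances to the next one,
// wrapping back to the first at the end.
struct demo_tour {
    const char** screens;  // names of the control demo screens, in order
    std::size_t count;     // how many screens there are
    std::size_t current = 0;

    // Wire this into the active screen's completion event
    // (button pressed, slider released, etc.).
    void complete() { current = (current + 1) % count; }

    const char* active() const { return screens[current]; }
};
```

Each control demo then only needs one line of glue: its "done" callback calls `complete()`, and the app redraws whatever `active()` names.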
David O'Neil wrote: If you have a text input
The device I'm targeting for the example has no keyboard, no USB input, and a 320x240 touchscreen. I admit I chuckled at the thought.
I finally figured it out. It just took a lot longer than the code would suggest.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
So make an onscreen keyboard that would obviously make use of scrollers? That code should be helpful to users trying to learn your system.
320x240 isn't really realistic for that. Devices this small typically don't have text input; they usually receive information like that via a cloud or Bluetooth connection.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
honey the codewitch wrote: The device I'm targeting for the example has no keyboard, no USB input, and a 320x240 touchscreen
So, if it's a common user-control library you're trying to demonstrate (sorry if I've misread the thread), then David's idea makes a lot of sense: draw a control, attach a simple Hello World callback just to demonstrate how to respond to it being used, and have a Next button to move to the page which shows another control.
If it's all touch, with no room to display a virtual keyboard...then it all depends on what sort of user input the control is supposed to provide. But having your sample code show how to (a) draw the control and (b) respond to events that it raises, should pretty much cover it, IMO.
An app that shows controls for the sake of showing controls sometimes is exactly what you need, as opposed to a real-world app that attempts to solve real-world problems. At least that would be my expectation.
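The "(a) draw the control, (b) respond to events it raises" shape can be sketched generically. Again, the names below are invented for illustration and are not the uix API; a `std::function` callback stands in for whatever event mechanism the library actually uses:

```cpp
#include <functional>
#include <string>

// Hypothetical control: a button with a caller-supplied pressed handler.
// A sample app wires on_pressed to something visible (e.g. changing a
// label to "Hello World"), which is all the demo needs to show.
struct push_button {
    std::string text;
    std::function<void()> on_pressed;  // set by the sample app

    // Invoked by the (simulated) touch layer when the button is tapped.
    void press() {
        if (on_pressed) on_pressed();
    }
};
```

The demo screen then constructs the control, assigns the handler, and the event wiring is the whole lesson; no application logic is needed beyond that.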
honey the codewitch wrote: I'm writing some examples for using my UIX library
1. Provide reference documentation for the library.
2. Identify common tasks or objectives of your library's users. What are they using it for?
3. Write correct, fairly minimal sequences of library calls required to perform each of those tasks. Work hard at keeping frills and options out the examples, except as "You can also do this:" sections at the end of each one.
I don't think the examples in the documentation need to be complete applications, since that can make it easy for your user to get lost in the forest. To guarantee that they work correctly, you'll need to build a fairly complete application around them to ensure your sequences work as expected, but that application doesn't need to be in the example documentation (you can certainly include it in the distribution for the library).
Oh, and by the way: include lots and lots of screen captures of what they should see, preferably from live examples.
Software Zen: delete this;
modified 30-Aug-23 12:39pm.