WildWestDesigns
Good points on the naming convention. We're really talking about machine learning and pattern matching. The black box concern is legit; I share it, actually. These systems already seem like black boxes to me. No one at Anthropic, OpenAI, Google, or xAI can show us a model's reasons for its output. What happens when it starts deciding who gets a loan or goes to jail and we can't know why? Yikes! Biases can be trained in, for sure. Hard to trust that. Not sure if that was your worry, but it's certainly mine.
I do have other concerns; I'm trying not to derail things too quickly, but I have quite a few more.
Here's where I think the framing gets a little sideways... nobody in this thread needs AGI.
No, my point was more about what is required for actual, true "AI" versus the feel-good marketing version of "AI", not about what we would actually need.
myront is trying to get a sky image at the right pixel dimensions for a banner. Topaz Gigapixel, which runs on the same type of models you're talking about, solves that in about 10 seconds (mostly). More of a workflow question than a philosophical one.
There are a lot of things one can automate without needing to go to other people for it (I'm the type to create in-house tooling for my competitive advantage; not everyone wants to, but that would at least make sure it isn't a "black box" and that I'm not beholden to someone else to fix what goes wrong). The flip side is the up-front work; depending on one's abilities, that can be quite an undertaking, but it saves time on the back end if one is doing the task enough to actually make it worth automating.
I have no problem with automating for efficiency, I just don't think everything always needs to be done by a 3rd party. But as I get older, I tend to avoid being beholden to others as much as humanly possible, especially as far as business goes.
Whatever we call it, these tools change how we do everything digitally. They're making the monotonous, soul-eating work easier.
Not really. This assumes that what it produces is ready to go, because unless it is, the time it takes to weed through the decisions in its output means it might as well have been quicker/easier to do it yourself from the get-go.
Why is it that people are being required to use these tools (even the ones in charge of programming them) as part of their job reviews? Humans will, on average, try to get away with as much as they can, when they can. If these tools were all they're cracked up to be, we should already be getting better results than we are right now, and there would be no need to tie them to job reviews.
I don't think leaning into that means giving up critical thinking. I'd say it takes more critical thinking to use these tools well than to not use them at all. It takes domain expertise and a lot of time/patience to make anything worthwhile. And we'll have to compete with each other to make even better things, which requires critical thinking by default.
Initially, that is the way it typically is with tech. However, past a certain point, more and more is abstracted, so there is less need for knowledge. Just look at Gen Z/Alpha now: most don't really know about file trees. They don't even know where files are saved (and they expect everything to be auto-saved, not realizing that Ctrl+S (or Shift+Ctrl+S) was a thing). How many of them can use a CLI (which is still the best way to resolve some problems, even on Windows and Mac)?
In some areas, we are already getting the "AI pause", which is essentially people waiting for that fancy autocorrect to kick in (with a delay of seconds, which adds up and keeps one from getting in "the zone", such as it is). Some have even said they knew what needed to be done, but started instinctively waiting until the "AI" du jour did its thing. That's not 3 or 4 generations of users down the line; that's right now. People are using it for summaries of whatever they are looking at. I certainly wouldn't trust that; I have seen it give wrong answers until called out (especially when it comes to code repos).
Let's not forget that in school we have students (who aren't yet at the stage of knowing better) using "AI", and we have teachers (some of whom are barely out of their own schooling and have only ever known schooling) using "AI" to check whether students are using "AI", with mixed results. The fact that we already have students using it as their main means does not give me much hope, especially considering the degradation already being reported from people who actually do know better.

Going as heavily into tech as we have in schools in general has had a negative impact; imagine what it's going to be like with "AI". There was a study not that long ago showing that kids learn better with analog methods than digital ones. Handwriting (even cursive) has been linked to better reading skills, and there is a connection between writing things down and memory, which is why all those study tips about writing flash cards for studying work: the actual act of writing has something to do with it, something that isn't achieved in the same manner by typing.
The creativity and knowledge loss is a thing, but I think the bigger risk is with the shops that opt out entirely and find themselves competing against shops that didn't. It's horse and buggy vs automobile.
Not quite. What would be closer to your analogy is pen/paper versus computer/Photoshop. There still has to be user input. This is more along the lines of making us the client and the "AI" the artist. Typing a prompt based on certain specs is not all that different from getting an email from a client with the brief (not just email, but I mention it because that is also a mainly text-based mode of communication). It's shifting that process over. Going from pen/paper to computer/Photoshop did not change that dynamic, just like going from horse/buggy to automobile: the driver is still mainly in control. Using "AI", the artist doesn't have that same level of control.
Even if one totally withdrew from using "AI", very rarely do things truly die. Market share may go way down (even with horse/buggy), but new things have rarely killed off their predecessors totally. I'll go to my usual examples: did radio kill the stage, did movies kill radio, did TV kill movies? They are all still around; none are totally dead. I can go even further: did CGI kill 2D animation (a 2D short drawn by a former Disney animator, 3 years and 11k drawings, just won an award)? Did CGI kill stop motion (I think Netflix has one stop-motion show coming this year, and the last movie I'm aware of was from 2022)? It's not like Harryhausen's day, but it's still going.