
AI Rant

WildWestDesigns

Active Member
That's the right answer. Lean into AI. You're either on the wave or under it.
"All that glitters is not gold" and plus that, most of what people are talking about on here is not true "AI". Most of that is marketing as it's easier for the plebs to understand it as "AI" compared to something that is more accurate. How they "learn" and how the come to their "conclusions" would lean into the fact that it's not true "AI" that we are talking about. Unless those 2 key factors change, it is highly doubtful that any of those models will achieve AGI, and that's really just a couple of handfuls of truly unique models, not offshoots of other models. I'm more worried about them becoming a "black box" compared to most anything else and the loss of knowledge that comes with that. That isn't something that I would like to "lean into". Now, I'm in the minority I'm sure as we have had brain drain due to tech for decades now and most people don't care. Ironically, creativity is one of those areas (the other is critical thinking), neither are a good thing for us to lose.
 

iam3toed

New Member
WildWestDesigns

Good points on the naming convention. We're really talking about machine learning and pattern matching. The black box concern is legit, I share that one actually. It seems like a black box already to me. No one at Anthropic or OpenAI or Google or xAI can show us a model's reason for its output. What happens when it starts deciding who gets a loan or goes to jail and we can't know why? Yikes! Biases can be trained in for sure. Hard to trust that. Not sure if that was your worry, but it's certainly mine.

Here's where I think the framing gets a little sideways... nobody in this thread needs AGI. myront is trying to get a sky image at the right pixel dimensions for a banner. Topaz Gigapixel, which runs on the same type of models you're talking about, solves that in about 10 seconds (mostly). More of a workflow question than a philosophical one.

Whatever we call it, these tools change how we do anything digitally. They're making the monotonous, soul-eating work easier. I don't think leaning into that means giving up critical thinking; I'd say it takes more critical thinking to use these tools well than not to use them at all. It takes domain expertise and a lot of time and patience to make anything worthwhile. And we'll have to compete with each other to make even better things, which requires critical thinking by default.

The creativity and knowledge loss is a thing, but I think the bigger risk is with the shops that opt out entirely and find themselves competing against shops that didn't. It's horse and buggy vs automobile. Not trying to call anyone a horse and buggy, but the automobile is here right now. That was the point I was shooting for anyway ;)

Good thoughts, WildWestDesigns. Glad to know you're thinking about this stuff. :thumb:
 

Bobby H

Arial Sucks.
iam3toed said:
Whatever we call it, these tools change how we do anything digitally. They're making the monotonous, soul-eating work easier.

I don't agree with that at all.

An "AI agent" can conjure up a pretty picture of a sign. What it can't do is deliver clean, production-ready vector-based artwork of the design. It can't generate shop drawings that are 100% accurate to scale. It's not going to instantly produce a scale-accurate side-view, "section detail" drawing of the sign's internal electrical features. It's not going to develop an accurate list of materials either. That's some of the soul-eating work I have to produce for certain projects in order to get an installation permit from whatever city where the sign is being installed.

From what I'm seeing with this AI crap: the bots are trying to do the actual creative work and not really doing a great job with it. The bots don't know where to begin with the actual nuts and bolts production stuff. Sign design is a whole lot more than designing a cool looking graphic.
 

iam3toed

New Member
Bobby H
I believe you're talking about using the image generation stuff, right? Like asking ChatGPT to create a section view for channel letters or a custom monument? I agree. It sucks at that right now. I wouldn't use it for that. That's trying to use a screwdriver to drive a nail. There's a better method, and it takes time to develop it. I should show you what I've built and use every day for work. That complexity you're talking about can be implemented in an app that a trained AI can use. I'm doing this right now with mapping and Braille signage. It's pretty neat. :)
 

WildWestDesigns

Active Member
iam3toed said:

Good points on the naming convention. We're really talking about machine learning and pattern matching. The black box concern is legit, I share that one actually. It seems like a black box already to me. No one at Anthropic or OpenAI or Google or xAI can show us a model's reason for its output. What happens when it starts deciding who gets a loan or goes to jail and we can't know why? Yikes! Biases can be trained in for sure. Hard to trust that. Not sure if that was your worry, but it's certainly mine.

I do have other concerns; I'm trying not to derail things too quickly, but I have quite a few more.
iam3toed said:
Here's where I think the framing gets a little sideways... nobody in this thread needs AGI.
No, my point was more about what's required for actual, true "AI" versus the feel-good marketing version of "AI", not about what we would actually need.

iam3toed said:
myront is trying to get a sky image at the right pixel dimensions for a banner. Topaz Gigapixel, which runs on the same type of models you're talking about, solves that in about 10 seconds (mostly). More of a workflow question than a philosophical one.
There are a lot of things one can automate without needing to go to other people for it. I'm the type to create in-house tooling for my competitive advantage; not everyone wants to, but at least that ensures it isn't a "black box" and I'm not beholden to someone else to fix what's going on. The flip side is the front-end work: depending on one's abilities, that can be quite an undertaking, but it saves time on the back end if one is doing it enough to actually make it worth automating.

I have no problem with automating for efficiency; I just don't think it always needs to be done by a third party. And as I get older, I tend not to like being beholden to others, especially in business, as much as I humanly can avoid it.

iam3toed said:
Whatever we call it, these tools change how we do anything digitally. They're making the monotonous, soul-eating work easier.
Not really. This assumes that what it produces is ready to go. Unless it is, the time it takes to weed through its decisions in its output means it might have been quicker and easier to do it yourself from the get-go.

Why are people being required to use these tools (even the ones in charge of programming them) as part of their job reviews? Humans will, on average, try to get away with as much as they can, when they can. If these tools were all they're cracked up to be, we should already be getting better results than we are right now, and they wouldn't need to be tied to job reviews.

iam3toed said:
I don't think leaning into that means giving up critical thinking. I'd say it takes more critical thinking to use these tools well than not to use them at all. It takes domain expertise and a lot of time and patience to make anything worthwhile. And we'll have to compete with each other to make even better things, which requires critical thinking by default.
Initially, that's typically the way it is with tech. However, past a certain point, more and more is abstracted away, so there's less need for knowledge. Just look at Gen Z/Alpha now: most don't really know about file trees. Many don't even know where their files are saved (and they expect everything to auto-save, not realizing Ctrl+S, or Shift+Ctrl+S, was ever a thing). How many of them can use the CLI? That's still the best way to resolve some problems, even on Windows and Mac.

In some areas, we're already seeing the "AI pause": people waiting for that fancy autocorrect to do its thing. The delay is only seconds, but it adds up and keeps one from getting into "the zone", such as it is. Some have even said they knew what needed to be done but started to instinctively wait until the "AI" du jour did it anyway. That's not three or four generations of users down the line; that's right now. People are using it for summaries of whatever they're looking at. I certainly wouldn't trust that; I've seen it give wrong answers until called out (especially when it comes to code repos).

Let's not forget that in school we have students (who aren't at the stage of knowing better) using "AI", and we have teachers (some barely out of their own schooling, who have only ever known schooling) using "AI" to check whether students are using "AI", with mixed results. The fact that students are already using it as their main method does not give me much hope, especially considering the degradation already being reported among people who do know better. Going as heavy into tech as we have in schools has already had a negative impact; imagine what it's going to be like with "AI".

There was a study not long ago showing that kids learn better with analog methods than digital ones. Handwriting, even cursive, has been linked to better reading skills, and there's a connection between writing things down and memory. That's why all those study tips recommend writing out flash cards: the physical act of writing does part of the work, something typing doesn't achieve in the same way.
iam3toed said:
The creativity and knowledge loss is a thing, but I think the bigger risk is with the shops that opt out entirely and find themselves competing against shops that didn't. It's horse and buggy vs automobile.
Not quite. Closer to your analogy would be pen/paper versus computer/Photoshop. There still has to be user input. With "AI", we become the client and the "AI" is the artist. Typing a prompt based on certain specs isn't all that different from the artist getting an email from a client with the brief (not just email, but I mention it because it's also a mainly text-based mode of communication). It shifts that process over. Going from pen/paper to computer/Photoshop didn't change that dynamic, just like going from horse/buggy to automobile: the rider is still mainly in control. Using "AI", the artist doesn't have that same level of control.

Even if one totally withdrew from using "AI", things very rarely truly die. Market share may go way down (as it did with the horse and buggy), but successors have rarely killed off their predecessors entirely. My go-to examples: did radio kill the stage? Did movies kill radio? Did TV kill movies? They're all still around; none are totally dead. I can go even further: did CGI kill 2D animation? (A 2D short drawn by one former Disney animator, three years and 11k drawings, just won an award.) Did CGI kill stop motion? (I think Netflix has one stop-motion show coming this year, and the last movie I'm aware of was from 2022.) It's not like Harryhausen's day, but it's still going.
 

iam3toed

New Member
WildWestDesigns
We seem pretty close in philosophy. You create in-house tooling for your competitive advantage. I started doing that as well. I definitely prefer to create the full stack if I can.

WildWestDesigns said:
No, my point was more about what's required for actual, true "AI" versus the feel-good marketing version of "AI", not about what we would actually need.
Roger that.

The "AI Pause" thing is gross. I can almost hear the brain cells melting away. There's no getting out of that without a really creative solution. That tech is here to stay.

I like your pen/paper vs computer/photoshop analogy. The user input point is great. I'd ask, "If there's no user input then who's driving the bus?" We're required to be part of that process. If not, that's where your black box becomes Black Mirror. That's kinda the whole point of this 3toedsoftware thing I started. I want to get AI and fleshies (haha) to start working together now before AI (learning, reasoning, problem-solving, perception, decision-making) does the whole thing.

I don't think old-school shops that don't adopt AI will completely go away. We still have a Blockbuster Video.

Ray Harryhausen is the best! Clash of the Titans!

If you're interested I can show you some cool toys I made. You may get a better idea of where things can go for the better. Lemme know!
 

WildWestDesigns

Active Member
iam3toed said:
We seem pretty close in philosophy. You create in-house tooling for your competitive advantage. I started doing that as well. I definitely prefer to create the full stack if I can.
Pretty much where I am at as well.
The "AI Pause" thing is gross. I can almost hear the brain cells melting away. There's no getting out of that without a really creative solution. That tech is here to stay.
Unfortunately, it is here to stay. But it's here to stay because of its backing, not really on its merit; that's the key distinction. Not everything new is actually worth keeping, and this isn't the only example.

The only thing I can hope for is that the newness wears off and its "role" in the workflow settles. To me, depending on the industry, this is just a fancier version of live/power trace. A lot of people have embraced that; some only know that (and only the particular implementation in whatever vendor software they run). I never have, mainly because even in the $15k programs I've dealt with, that functionality sucked, still sucks, and always will, unless the source file fits within the narrow parameters that happen to work with their implementation of it.

iam3toed said:
I like your pen/paper vs computer/photoshop analogy. The user input point is great. I'd ask, "If there's no user input then who's driving the bus?" We're required to be part of that process. If not, that's where your black box becomes Black Mirror. That's kinda the whole point of this 3toedsoftware thing I started. I want to get AI and fleshies (haha) to start working together now before AI (learning, reasoning, problem-solving, perception, decision-making) does the whole thing.
Part of me is worried that it's too late, especially with how much it's being used in schools as a replacement for the foundations. Regardless of whether the schools push it or not, they really aren't going hard enough to stop it (and it's hard to make the case when they're using it themselves to detect when the kids are using it).
iam3toed said:
If you're interested I can show you some cool toys I made. You may get a better idea of where things can go for the better. Lemme know!
I always like cool toys. That's the irony: I'm a techie, but with limits. In that way, I'm more like a true Luddite. They weren't really against tech, just how it was implemented; that was their sticking point. The definition has morphed over the years.
 

iam3toed

New Member
I keep all my stuff here -> 3toedsoftware.com
It has a membership signup thing. I didn't know any other way of measuring interest while filtering out bots. If you are aware of a better way, lemme know! Hoping by next year I'll be able to pay some bills if people think it's useful. *fingers crossed*.

There are five apps right now. They are all web apps, so you don't install anything. It's just a webpage that does stuff for you. I use Mapping Slayer almost every day. If you do lots of little signs, like ADA room plaques for offices or schools, it's awesome. If you use Chat or Claude or Gemini or Grok, they are pretty good (not perfect) at showing you how to use the program. Just give them this user guide and start asking questions -> https://www.3toedsoftware.com/ms_user_guide

I added a boatload of features, so it may look overwhelming. If so, you can just toggle the POWER MODE button off (towards the top left). A bunch of features will disappear so it's easier to wrap your noggin around.

It gets really awesome when you give AI access to these tools. I use Claude because it has browser control right now, so it will actually start using the app with you. There's a browser extension called "Claude In Chrome" if you want to try this out. It's here: https://chromewebstore.google.com/publisher/anthropic/u308d63ea0533efcf7ba778ad42da7390
I think you need a paid account to use Claude In Chrome. Maybe I should make a Youtube video showing this. Would you be interested in that?

The thing I want to drive home is that everyone reading this can make their own tools too. I used all four major AIs (mentioned above) to make the ones on my site. It just takes time, patience, and a thorough understanding of the problem you want to solve. If you learn how to make your own tools, you will be miles ahead of anyone who doesn't.
This is a fact.

I'm 16x faster than working manually. I'm 4x faster than using Bluebeam Revu. I know because I've kept my numbers for years now.

Also, if anyone has a problem they're trying to solve in the digital world, let me know. Maybe I can create something that solves that problem. If so, I'll put it on my site and make it available to everyone. This is my best swing at keeping our industry on the AI wave and not under it.
 

Think713

New Member
Bobby H said:
The water that re-enters the cycle must be TREATED before it can be re-used. The process is NOT cost-free.



While AGI poses its own Pandora's box of possible, extremely destructive problems the currently "dumb" AI has its own potential to utterly devastate the global economy.

Various tech companies are blindly obsessed with trying to achieve various milestones with AI and be the first to do so, consequences be damned. "We're in a race against China" is the common rationale. The tech bros are 100% willing to let their AI tools wipe out many dozens of job categories even if the AI tech is rife with flaws and delivers sub-par results. So many people are willing to let enshittification contaminate so many things as long as they can make a fast buck off of it.
Treated water? How is that any different?
 

Bobby H

Arial Sucks.
Think713 said:
Treated water? How is that any different?

It costs money to treat water to make it safe to drink, shower with, and use for other purposes. The water supply infrastructure costs money to build and maintain. The process is not cost-free. Data centers are not using rainwater straight out of a creek or pond; they're getting their water from the municipal supply. Any city or town has a finite daily water capacity. They treat only so many millions of gallons per day. It is not an infinite supply, as you claimed earlier.

These data centers put a strain on the electrical grid and local water supply, all for little to no benefit to the local area. The only effect local residents see is their utility bills rising. Data centers don't generate many local jobs. Local and state politicians are often foolish enough to give the companies who build these data centers big tax breaks. Meanwhile, the real purpose of these data centers is to eliminate human-held jobs. Another immediate effect we're seeing from them is price hikes on computing components like RAM and GPUs. I can't see enough of an upside to offset all the drawbacks, so I sure wouldn't want a giant data center built anywhere near where I live.
 

Johnny Best

Active Member
Las Vegas has a big treated-water system; the fountains and drinking water get reused and reused. Like a large data center in the desert.
 

Bobby H

Arial Sucks.
None of that water recycling in Vegas is done free of charge. And they still have to draw whatever water they can from Lake Mead.
 

iam3toed

New Member
These are just the numbers we know. It's probably a lot higher right now.

Building it -> Training an AI model uses a LOT of electricity and water. Think powering thousands of homes for a year (~62,000,000 kWh), and enough water to supply a small town for a month (~13.4 million gallons). That's a one-time thing per model.

Asking it questions -> Each response uses about as much power as an old-school lightbulb for 20 minutes (~19 Wh), and about a shot glass of water.

If all 5 billion internet-connected people used AI regularly, we'd need several new power plants just for AI: roughly 345 TWh per year, about 40 nuclear power plants' worth of output just for AI queries. For context, the US currently has 93 nuclear reactors total.
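Those totals can be sanity-checked with some back-of-envelope arithmetic. A minimal sketch; the 10-queries-per-day usage rate, the ~10,700 kWh/yr typical US household, and a 1 GW reactor running flat out are my assumptions for illustration, not figures from the sources below:

```python
# Back-of-envelope check on the AI energy figures quoted above.
# All inputs are illustrative assumptions, not sourced measurements.

WH_PER_QUERY = 19            # ~19 Wh per response, as quoted above
USERS = 5_000_000_000        # 5 billion internet-connected people
QUERIES_PER_DAY = 10         # assumed "regular" usage per person

# Annual demand if everyone queried at that rate
annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * 365 * USERS
annual_twh = annual_wh / 1e12
print(f"Annual AI query demand: ~{annual_twh:.0f} TWh")  # ~347 TWh

# How many 1 GW reactors (8760 h/yr, full output) that would take
reactor_twh_per_year = 1e9 * 8760 / 1e12   # 8.76 TWh per reactor-year
print(f"Reactors needed: ~{annual_twh / reactor_twh_per_year:.0f}")  # ~40

# Training: 62,000,000 kWh vs a typical US home (~10,700 kWh/yr)
homes_for_a_year = 62_000_000 / 10_700
print(f"One training run ~= {homes_for_a_year:.0f} homes for a year")
```

So the quoted ~345 TWh and ~40 reactors hang together at roughly 10 queries per person per day, and the training figure does land in the "thousands of homes" range.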


Sources:
University of Rhode Island AI Lab — GPT-5 energy per query
Tom's Hardware — GPT-5 power consumption analysis
BestBrokers.com — ChatGPT electricity consumption breakdown
Microsoft 2023 Sustainability Report — data center water usage during GPT-4 training
Epoch AI — ChatGPT energy estimates
The Conversation / University of Virginia — AI water footprint methodology
US Energy Information Administration (EIA) — nuclear reactor count and output
World Nuclear Association — global nuclear power data
 

rydods

Member for quite some time.
iam3toed said:
These are just the numbers we know. It's probably a lot higher right now. [...]
The billionaires built it. My hope is that they can figure out how to sustain it.
 

Bobby H

Arial Sucks.
The billionaires are playing the usual game of taking as much as they can for themselves while taking it for granted that everyone else will act honestly and keep the systems of our society maintained.

One immediate near-term effect of AI-related job losses is the amputation of the talent pipeline in various industries. Replacing entry-level workers with AI "agents" cuts off the bottom of the ladder that employees climb to higher positions. Everyone in those upper positions eventually ages out of the workforce, and every company needs a chain of younger employees to replace those who retire.

70% of our nation's economy is driven by consumer spending. Consumers must have paying jobs in order to buy stuff. If so-called AI wipes out countless millions of white collar jobs (and automation continues killing blue collar jobs) the consumption base of our economy will be far smaller. Very simple math.

Likewise, our government (including our military) is 100% dependent on employed human beings and the taxes they pay. AI bots don't buy shit. AI bots don't pay taxes. If AI bots are allowed to wipe out much of the tax base the situation will be dire in more than just an economic sense. Billionaires will have more existential worries than just finding enough people who can still fix and replace broken components in a data center.
 

Texas_Signmaker

Very Active Signmaker
You all worry too much about doomsday. We've had so much innovation in the last 200 years... society-changing, life-changing stuff. This probably won't be the fall of society. Relax and get through the day... when you retire you'll be able to afford that sex robot.
 

WildWestDesigns

Active Member
Texas_Signmaker said:
You all worry too much about doomsday. We've had so much innovation in the last 200 years... society-changing, life-changing stuff. This probably won't be the fall of society. Relax and get through the day... when you retire you'll be able to afford that sex robot.
Directly, no. But it is a "doomsday" in terms of knowledge base. We've already started to see studies showing the current generation is lagging behind due to all the tech in the classroom (and this was before "AI" hit it big), and that for some things, tactile, analog methods are better at establishing the pathways in the mind that help with critical thinking and creativity. So like anything, something can be good; it's the implementation that makes or breaks it.

We've had times when people thought we were going to starve or that resources would deplete, and then agriculture changed (some techniques became cheaper, and some resource inputs shifted with costs), so nothing new there (Thomas Malthus' predictions, late 1790s I think). The thing is, we're already getting hints of too much tech too early in kids' development. So while I don't think it's a Skynet-level doomsday (at least not yet), I do think the bottom is going to fall out of the knowledge base of the younger generation, on average. That's the "doomsday" I'd be concerned about.
 

Craig Keller

New Member
Well, I had an Amish customer the other day wanting something. We were giving her some ideas, and she asked to use our smartphone to call her Amish sister-in-law in a different state, who was using ChatGPT to create an idea for the artwork, then emailing it to us as she made changes. So there you go! You really have to just deal with it if you want to stay in business these days!
 

WildWestDesigns

Active Member
Craig Keller said:
Well, I had an Amish customer the other day wanting something. We were giving her some ideas, and she asked to use our smartphone to call her Amish sister-in-law in a different state, who was using ChatGPT to create an idea for the artwork, then emailing it to us as she made changes. So there you go! You really have to just deal with it if you want to stay in business these days!
Amish or Mennonite? One group I wouldn't be surprised to see embracing some tech; the other would highly surprise me, unless they weren't truly a part of that group anymore.
 