
That Time of Year Again

WildWestDesigns

Active Member
I just can't really stress enough to keep your production rigs off the internet. Particularly with how Win 10 is going.

It always seems like I'm troubleshooting my dad's computer, and then I find this delightful bit of news.

I just can't say it enough, if you are running Win 10 and it's handling production, keep it off the WAN. LAN is fine, WAN is no bueno.

OS updates should not be doing what they are doing when it comes to Win 10 updates. Especially since they are now for all intents and purposes forced updates.
 

Bobby H

Arial Sucks.
Generally I just don't trust computers for storing any important data, regardless if the computer is connected to the Internet or not. I'm pretty anal retentive about backing up data I want to protect/keep on 2 or more external hard disc drives. If you don't have the data stored safely in 2 or more places you don't have it stored. This includes keeping a copy stored off site. You can make multiple back-ups, but all those back-ups will be lost if your work place burns down to the ground.

It's bad enough that a Not Ready For Prime Time operating system update can gum up the works. Windows is not the only OS afflicted by this. But hardware failures are always a threat too. If I have to keep art/production files on a disc inside a work computer I usually make sure it is a different physical hard disc from the one holding the operating system. If an errant update or something else hoses your OS it's not hard to start over with a clean "system restore." The only thing you lose is maybe a couple hours of time.

I love using notebook computers for their convenience and portability. But most new notebooks are big on using solid state drives. Those things are great in terms of speed. But they are a liability for long term storage. When a traditional hard disc with magnetic platters starts going bad you'll get some telltale signs that it needs to be replaced. You get no such warnings with a SSD. One minute it's working 100% perfect. Next minute it is dead, and so is all the data on it. Without warning.

External flash drives are popular for their portability. But they absolutely stink for storage reliability. I won't spend any more than what it costs for a modest 32GB USB stick. There's no telling when one of those things will go bad. One stick will work like a champ for several years while another goes bad in less than a couple months. That even goes for the highest priced "ultra" versions.

If you use Adobe software, particularly Creative Cloud, there's little choice but to have your computer connected to the Internet. The Typekit service (which is a pretty awesome bonus) only works with an "always on" Internet connection. The rest of the software you install has to be able to occasionally "phone home." Other applications are doing some of the same stuff. There's a lot of cloud storage services which demand Internet connections. I use an iPad Pro and Adobe's "mobile" apps. I can only use my Creative Cloud folder, Dropbox folder or iCloud folder to move data between my iPad and work computers. That requires an Internet connection.

So really in the end the best thing to do is regularly back up your data to hard drive volumes not connected to your computer and have those volumes stored in more than one place. Malware does not always need an Internet connection to hose a computer. The payload can be delivered via a USB memory stick or CD-R provided by a customer. Or it can infiltrate your home or work network. If I got hit with a ransomware attack right now I would only be angry and annoyed for the time it takes to reformat the hard drive and re-install software. I wouldn't lose any important data.
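The "back up to volumes in more than one place" rule above can be sketched as a tiny shell routine. Everything here is illustrative: the function name, the dated-snapshot naming, and the paths are assumptions, not anyone's actual backup setup.

```shell
#!/bin/sh
# Minimal offline-backup sketch: copy a working folder into a dated
# snapshot directory on an external volume, then sanity-check the copy.
# The paths passed in are placeholders -- point them at your own drives.
backup_snapshot() {
    src="$1"            # folder to protect
    dest_root="$2"      # mount point of an external backup volume
    stamp=$(date +%Y-%m-%d)
    dest="$dest_root/backup-$stamp"
    mkdir -p "$dest"
    cp -a "$src/." "$dest/"   # -a preserves permissions and timestamps
    # crude verification: the snapshot must hold as many files as the source
    [ "$(find "$src" -type f | wc -l)" -eq "$(find "$dest" -type f | wc -l)" ]
}
```

Running the same routine against two different external volumes (one kept off site) is what gets you the "stored safely in 2 or more places" coverage described above.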
 

WildWestDesigns

Active Member
Generally I just don't trust computers for storing any important data, regardless if the computer is connected to the Internet or not. I'm pretty anal retentive about backing up data I want to protect/keep on 2 or more external hard disc drives. If you don't have the data stored safely in 2 or more places you don't have it stored. This includes keeping a copy stored off site. You can make multiple back-ups, but all those back-ups will be lost if your work place burns down to the ground.

When I was talking about keeping it off the internet, I was talking more about software, not necessarily working files. While this update seems to affect files more, I was just using it as yet another reason why always taking updates is not a good thing.

It's bad enough that a Not Ready For Prime Time operating system update can gum up the works. Windows is not the only OS afflicted by this. But hardware failures are always a threat too. If I have to keep art/production files on a disc inside a work computer I usually make sure it is a different physical hard disc from the one holding the operating system. If an errant update or something else hoses your OS it's not hard to start over with a clean "system restore." The only thing you lose is maybe a couple hours of time.

No, updates are a double-edged sword for any OS and/or program (depending on what is getting updated). I do believe that Windows tends to be the most afflicted. Which I can understand to a point. Forcing updates erodes that understanding though.

2 hrs for a system restore? Man, I'm truly loving not running Windows (or Mac) anymore on bare metal. Within 2 hrs, even on a fresh OS install, I'm firmly back into work. It takes 20-30 minutes to get everything up and running on a fresh OS install for me. If I was better at scripts, I could probably get that trimmed down even further.



External flash drives are popular for their portability. But they absolutely stink for storage reliability. I won't spend any more than what it costs for a modest 32GB USB stick. There's no telling when one of those things will go bad. One stick will work like a champ for several years while another goes bad in less than a couple months. That even goes for the highest priced "ultra" versions.

Jump drives are designed to transport files from point A to point B, not really for long term storage. Having said that (I do get higher storage capacities than what you mention), I do use jump drives for portable programs (all my production programs are portable and sand-boxed) and for large files. If I'm traveling with the family, most hotel TVs have USB ports, so I'll use them for video files as well. I also tend to run live OS installs off USBs.



If you use Adobe software, particularly Creative Cloud, there's little choice but to have your computer connected to the Internet. The Typekit service (which is a pretty awesome bonus) only works with an "always on" Internet connection. The rest of the software you install has to be able to occasionally "phone home." Other applications are doing some of the same stuff. There's a lot of cloud storage services which demand Internet connections. I use an iPad Pro and Adobe's "mobile" apps. I can only use my Creative Cloud folder, Dropbox folder or iCloud folder to move data between my iPad and work computers. That requires an Internet connection.

I stopped with CS6, and I stopped using Windows when Win 10 came out. I only have one tablet (a Cintiq) that has Windows installed on bare metal. All the others run Linux, and I do have a VM or two running Windows (isolated from any outside connection).

All the applications that I run are portable apps (AppImages — even for programs that don't come in a Linux portable version from the devs, like Inkscape; the joys of open source), and I have them further sandboxed beyond that. The libraries that they use are in the AppImage itself, so they don't even call on system libraries.

I tend to use my own NAS storage; I'm not a fan of 3rd party cloud backups. Maybe as a tertiary redundant backup, but not one of the 2 that one needs to have for a true backup. I run a NAS at my place (actually 3), plus at my sister's, niece's, and parents'. The one at mine does not read NTFS file systems (I made sure of that), so any Windows computers that happen to be at my place won't even pick it up as a DLNA server.

The payload can be delivered via a USB memory stick or CD-R provided by a customer. Or it can infiltrate your home or work network.

Those are a non-issue as I don't get jump drives or CD-Rs from customers (and with optical drives going by the wayside, I doubt CD-Rs will be much of an issue for much longer for most people).

Malware can attack your desktop PC (by which I mean any x86 computer that can perform tasks locally, and this does include Macs), but that same malware can go after your router as well. I'm more worried about IoT vectors than traditional means, as end users are totally at the mercy of the vendor to support those devices and keep them up to date.

Having said that, I actually have 2 networks at my place. One has WAN access; anything that comes in there is scanned, put on a USB drive, and then put on a computer that is part of the LAN-only network. Those are the production computers. The process is reversed if a file heads back out.

With how the USBs and CD-Rs are handled, I blame MS on that one. Over the years, they have always chosen convenience over security. For USBs, CDs, even printers. I have things set up so that there are some hoops to jump through to even change a printer setting from B&W to color.

If I got hit with a ransomware attack right now I would only be angry and annoyed for the time it takes to reformat the hard drive and re-install software. I wouldn't lose any important data.

That's really the way it should be. Ransomware should not be a thing. Period.

The fact that it was able to cripple hospitals and businesses kinda ticks me off. That tells me their IT staff wasn't doing their job.
 

WildWestDesigns

Active Member
Can you not cut updates off on Windows 10?

That would be called keeping your computer off the WAN.

For the peons that have certain Win 10 versions (you and I), at best we can defer updates. We can't totally turn them off. There used to be workarounds, but over the course of the updates, those workarounds stopped working. So if you bought a new computer with Win 10 now, I highly doubt you'll be able to use those workarounds unless you happened to get old stock.

Enterprise customers, I believe, have to approve updates. For the most part, peons like us won't be able to legally get the Enterprise version.

There may be 3rd party solutions that block the updates. Unfortunately, like some of the 3rd party solutions that blocked the Get Win 10 prompt, I would be afraid they could deliver malware. Besides that, there should be built-in, easily discoverable options to disable updates.

Unfortunately, there is a current downside to stopping updates. MS only supports each Win 10 release for 18 months from its release date (not from when you got it). That effectively makes it a rolling release distro. For me, these versions of Windows have effectively lost their status (and support) as an enterprise OS. Mac is better, as I believe they support their versions for 3 yrs (I could be wrong, might want to check on that). Ironically, there are a lot of LTS versions of Linux with 5 to 10 yrs of support for an individual release. For production environments, that's where it should be.

Now another downside to this is the software angle. For those on Adobe CC, Adobe just announced that the next upcoming version will have programs that only run on the latest versions of Win 10. 2 of those would have affected me if I was on the CC programs (Audition and Premiere). They won't run on Win 7, 8.1, or earlier versions of Win 10. And this affects Mac users as well. I should note that Win 7 and 8.1 have not reached EOL. I could understand it if they had reached EOL or were extremely close to it; neither is. So here you are, paying a subscription in order to have the latest and greatest software, and you may have to upgrade hardware and OS before you budgeted for it in order to keep being on the latest and greatest. So it's either stick with the last version that works (which brings back one of the issues of staying with perpetual licensed programs) or upgrade everything (regardless of whether you are ready) to be able to use the latest and greatest.
 

WildWestDesigns

Active Member
Hmmm, just read an article from Ars that was out yesterday evening. Apparently this issue was reported in the testing group, but was hard to pinpoint there. And there was no commonality in what's happening once it was released into the wild either (just as in the testing group). However, in my mind, if something is happening in testing, it should have been looked at there before the update is sent out. That is the point of testing.

Now, I do believe (as Bobby mentioned) backups (true backups) should be done. So the consumer is partly to blame for how much this has affected them. But if a vendor is going to force updates, those updates have to be impeccable. No doubt in my mind. And that's not always going to happen. Even Apple, which has a tighter grip on hardware/software combos, still has issues, including one with the rollout of High Sierra.
 

Texas_Signmaker

Very Active Signmaker
Hmmm, just read an article from Ars that was out yesterday evening. Apparently this issue was reported in the testing group, but was hard to pinpoint there. And there was no commonality in what's happening once it was released into the wild either (just as in the testing group). However, in my mind, if something is happening in testing, it should have been looked at there before the update is sent out. That is the point of testing.

Now, I do believe (as Bobby mentioned) backups (true backups) should be done. So the consumer is partly to blame for how much this has affected them. But if a vendor is going to force updates, those updates have to be impeccable. No doubt in my mind. And that's not always going to happen. Even Apple, which has a tighter grip on hardware/software combos, still has issues, including one with the rollout of High Sierra.

I read they didn't even test the build that they released
 

WildWestDesigns

Active Member
I read they didn't even test the build that they released

Well, I had read that there were some layoffs, so some of the QC wasn't going to be there (not that it really is there for a lot of their updates), but if that's true and they are pushing updates without testing, that's even worse.
 

Bobby H

Arial Sucks.
Microsoft needs to stop it with the forced updates for Windows 10.

I can understand their reasoning for pushing these updates. Many computer users are very lazy and simply will not install updates on their own even if the update eliminates a very critical security hole or fixes a really terrible bug. Those users will not install updates until some kind of disaster strikes.

Unfortunately Microsoft's own people (and sub-contractors) are far from perfect. Too often an update to fix one thing will break something else. And Microsoft is guilty of pushing out these updates before they've been properly tested.

This latest update fiasco puts more light on an extremely common problem: many computer users absolutely suck at managing their files. They just let immense amounts of data pile up in the "My Documents" folder. They don't organize anything (much less back up any of it). The bug in this new update sometimes deletes all the files in the user's "My Documents" folders (files, pictures, videos, etc). I thought I read something about it possibly deleting files in other folders on the boot drive as well.

WildWestDesigns said:
2 hrs for a system restore? Man, I'm truly loving not running Windows (or Mac) anymore on bare metal. 2 hrs, even on a fresh OS install, I'm firmly back into work. 20-30 minutes to get everything up and running on a fresh OS install for me. If I was better at scripts, I could probably get that trimmed down even further.

If someone is doing a truly clean re-install of Windows that user has to re-install the OS, re-install applications, plug-ins, drivers, fonts, etc. Unless the user is running some kind of backup/restore software (which has its own pros and cons) there's no way to avoid burning a good chunk of time getting a computer up and running again from scratch.

WildWestDesigns said:
Now another downside to this and that's with the software angle. For those that are on Adobe CC, they just made a statement that the next upcoming version will have programs that will only run on the latest versions of Win 10. 2 of those would have affected me if I was on the CC programs (Audition and Premier). They wouldn't run on Win 7, 8.1 and earlier versions of Win 10. And this does affect Mac users as well. I should note that Win 7 and 8.1 have not reached EOL. I could understand it if they had reached EOL or extremely close to it, neither are. So here you are, paying subscription in order to have the latest greatest software and you may have to upgrade hardware and OS before you budgeted for it in order to keep being on the latest and greatest. So it's either stick with the last version that works (which brings back to one issue of staying with perpetual licensed programs) or upgrade everything (regardless if you are ready) to be able to use the latest and greatest.

Adobe has done this kind of thing before. The first versions of Creative Suite would only run on Windows 2000 and later. They wouldn't work on Win95, Win98, WinME or WinNT.
 

Texas_Signmaker

Very Active Signmaker
Well, I had read that there were some layoffs, so some of the QC wasn't going to be there (not that it really is there for a lot of their updates), but if that's true and they are pushing updates without testing, that's even worse.

Yeah, apparently they did not test it with either Fast Ring or Slow Ring...just pushed it out to the masses.
 

WildWestDesigns

Active Member
This latest update fiasco puts more light on an extremely common problem: many computer users absolutely suck at managing their files. They just let immense amounts of data pile up in the "My Documents" folder. They don't organize anything (much less back up any of it). The bug in this new update sometimes deletes all the files in the user's "My Documents" folders (files, pictures, videos, etc). I thought I read something about it possibly deleting files in other folders on the boot drive as well.

That's new to me, but I wouldn't be surprised.

Users suck at managing their own files, and the files of others as well. That companies are suckered into ransomware shows this. Having encryption keys very close to the encrypted files themselves also shows this.

This is unacceptable. As involved as computers are in our lives (business and personal), this is unacceptable. My entire life has been around the PC. I have messed with everything on bare metal from DOS, back when it was new and being actively worked on, up until Win 10 (hated XP much more than Win 10 itself; I just dislike MS's policies with regard to Win 10 more than the OS itself). About the only MS OS that I haven't messed with is Xenix. That goes back to a time when Apple computers were also considered PCs (and to me, they still are). Although I would argue that with Win 10, a PC is no longer a PC. People are still falling for the same crap that they did during the 9x era.

If someone is doing a truly clean re-install of Windows that user has to re-install the OS, re-install applications, plug-ins, drivers, fonts, etc. Unless the user is running some kind of backup/restore software (which has its own pros and cons) there's no way to avoid burning a good chunk of time getting a computer up and running again from scratch.

Yep, 20-30 minutes on a fresh install of Linux to get everything back up to snuff. That's with "re-installing" applications (remember, portable programs), plugins (which are bundled into the portable AppImage itself; I typically have to do that on my own), fonts, and drivers (really only one, the Nvidia driver on my 3 monitor setup, which is easy enough to do; the Wacom driver is bundled into most distros (sans Suse, Arch)).

All of that extra install stuff could be scripted into a Bash script (cutting down the time), but my script skills aren't what they should be.
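A post-install routine like the one described could start as something like the sketch below. Every directory name is an assumption for illustration, and the distro-specific steps are left as comments since they vary; this is not anyone's actual setup script.

```shell
#!/bin/bash
# Sketch of a fresh-install setup script for a portable-apps workflow.
# Paths are placeholders; distro-specific commands are commented out.
setup_workstation() {
    appdir="$1"     # where the portable AppImages get restored to
    fontdir="$2"    # user font directory
    mkdir -p "$appdir" "$fontdir"
    # restore fonts from a backup volume, then rebuild the font cache, e.g.:
    #   cp -a /mnt/backup/fonts/. "$fontdir/" && fc-cache -f
    # restored AppImages can lose their executable bit on some copies; fix that
    find "$appdir" -name '*.AppImage' -exec chmod +x {} +
    # proprietary driver (Debian/Ubuntu example; adjust for your distro):
    #   sudo apt-get install -y nvidia-driver
}
```

Called as, say, `setup_workstation "$HOME/Apps" "$HOME/.local/share/fonts"`, a script like this is the kind of thing that trims a 20-30 minute manual setup down further.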

Adobe has done this kind of thing before. The first versions of Creative Suite would only run on Windows 2000 and later. They wouldn't work on Win95, Win98, WinME or WinNT.

This actually happens in all software.

The biggest difference with subscription software is that one is still paying to be on the latest and greatest, yet may not be able to use it until they upgrade their system.

Between the subscription software and the rolling release nature of Win 10, planned obsolescence is even more pronounced. And yet end users are still paying and paying and paying.

Fwiw, I really hate win10.

So, I take it you aren't going to be signing up for any MMDs?
 

WildWestDesigns

Active Member
I'll just sit back in my corner with my Windows 7 and my Adobe Illy disc and be content

That'll work for a good bit, but eventually you'll either be keeping older computers working (which progressively gets harder (mainly in terms of finding parts) and costlier to do) or you'll have to start VMing the older stuff.

Then, of course, buy newer equipment that requires newer software/drivers etc that'll throw a wrench in the plans as well.

Don't get me wrong, the only way that Windows is going to touch any of my computers from here on out (unless something not of my choosing compels me to do otherwise) is going to be through a VM.
 

Bobby H

Arial Sucks.
The obvious planned obsolescence cycle is annoying. But I get it. The computer industry is a business. They're not going to survive by selling a customer one piece of computer hardware or a box of software and making those products last where the customer never has to re-buy any of it ever again. Not many of us would be able to keep our jobs in sign companies if customers didn't have to buy new signs every few years or so either.

With all that being said, when I look back a couple or so decades to how computers and software were back then, I really have to say computer customers have it easier today. Mainstream hardware back then was more expensive and didn't have nearly the life span of today's hardware. It's not difficult to get 10+ years of life out of a good (not high end) desktop computer and 7+ years out of a good notebook. That sure wasn't the case back in the 1990's.

Longer hardware life has to be one of the reasons why so many companies remove a lot of backward compatibility. They want to push you into buying new hardware and software. The days are long over where many users deliberately bought new PCs every couple or so years to have the latest-greatest stuff. Now they hold onto old PCs until they break and cannot be fixed. It won't be long before mobile phones fall into that zone; each new yearly phone release reveals ever diminishing returns. The upgrades aren't wowing people anymore. Neither are the sky high prices.

I understand the hatred against subscription-based software, Adobe's Creative Cloud in particular. The Adobe stuff is pretty integral to our work flow. There's enough new features and other bonus goodies (such as Typekit) to make it a pretty good deal. On top of that, some outstanding Illustrator plug-ins (like those from Astute Graphics) work on CC-level versions of Illustrator.

One thing I really do not like: companies removing support for old files. Corel goofed big time by removing support for CDR files made in version 5 or earlier. The last couple or so versions of CorelDRAW can't open such files (or import them either). I can open a 30 year old AI 88 file in Adobe Illustrator. I should not have to keep a vintage computer around or run virtual machines just to open old archive files.
 

WildWestDesigns

Active Member
That sure wasn't the case back in the 1990's.

I've got a Win 95 Toshiba laptop that still works.

It won't be long before mobile phones fall into that zone; each new yearly phone release reveals ever diminishing returns. The upgrades aren't wowing people anymore. Neither are the sky high prices.


If you are a Mac/iPhone user, it's already there. In fact, Apple is trying to accelerate this process by removing the means to do your own repair work (or to go to a qualified 3rd party that isn't Apple).

On top of that, some outstanding Illustrator plug-ins (like those from Astute Graphics) work on CC-level versions of Illustrator.

Unless it's recently changed, they also still work with CS6 (which is what I had them on when I got them) and CS5. CS5 may have been dropped. I have an older set that works on CS5 in another VM.


One thing I really do not like: companies removing support for old files. Corel goofed big time by removing support for CDR files made in version 5 or earlier. The last couple or so versions of CorelDRAW can't open such files (or import them either). I can open a 30 year old AI 88 file in Adobe Illustrator.

This is another case of planned obsolescence as well. It's not new. I have had files from Adobe programs (mainly Ps, but it has happened in Ai as well) that didn't open in the new version. I think it really depends on what tools were used and whether there was a major re-write between the legacy version and the more modern version.


I should not have to keep a vintage computer around or run virtual machines just to open old archive files.

As I said the last time you pointed that out, you don't have to. There is open source software that can read those older files, so one doesn't have to keep a vintage computer around or run a VM of said legacy software.

Now, again, this is not a recommendation to go to an entirely open source workflow (while I'm lucky enough to have been able to do so, it isn't feasible for a lot of people); it is only a suggestion of another option beyond the false dilemma presented in the above quote.
 

Bobby H

Arial Sucks.
False dilemma? It's very much a for real problem. Corel at some point a few years ago decided it was going to cut off file open/import support for files made in version 5 and earlier. This is very different from Adobe Illustrator having problems opening old archive files due to missing plug-ins. In the case of CorelDRAW it simply won't even try to open the old files at all. This goofy move penalizes its longest term customers. What's next? Are they going to start cutting off file open/import support for later versions, such as version 9, X3, etc?

Whether one has an old PC with an old copy of CorelDRAW or is running a VM on a new one, it's not very practical to go through extra steps with that stuff just to pull up an old file. It sure isn't practical to pull up dozens, hundreds or even thousands of files in an old version of a graphics application and save them all forward a few versions.

Going "open source" isn't a bulletproof route either. Builds of Linux and the applications running on them are just as susceptible to this problem. Having to pay or not pay for the software doesn't change that. Fallible human beings are still involved in developing those applications. They can still make "I know what's best for you" decisions that create problems for users. Or their applications can die off very much like any commercial application can. Many sign shops are pretty dependent on software from Corel, Adobe or even both. Very often you need those host applications, such as a current version of Illustrator, to open client art files without errors.
 

WildWestDesigns

Active Member
False dilemma? It's very much a for real problem.
It's a false dilemma in that you presented the only options as running legacy hardware with legacy software or VMing. I mentioned yet another option. You're talking about something else. I'm not debating that it's an issue; it's an issue with all programs. What I'm calling a false dilemma is the claim that there are only 2 options to remedy that situation.

Also, I am going to mention AGAIN, this was not meant as a recommendation to go totally open source. Not everyone can, and those who try will have varying degrees of success.

However, getting a legacy program to run on a newer system is not a difficult task if the program is portable (all of my production programs are, and cross platform as well, so they do have Windows versions). The other nice thing is that you can just build that legacy open source program on a newer OS. Far more options. Some really easy, some not so. I will say this: it's easier to build from source on Linux than on Win or Mac.

What open source does allow you to do is use that older legacy version (providing the arch is still supported; some are multi arch) and get it to work on a newer system. Closed source programs have no such ability. So even if an open source program's newer versions turn out to be no bueno down the road, chances are you can still get that legacy version working.

I've got an open source digitizing program that didn't make it out of the Win 98 days but still works in Win 10. Some programs from the pre-SE days of Win 98 don't work on 64 bit systems: their installation stub is 16 bit, and 64 bit Win systems have no 16 bit support. Either it doesn't work, or the user finds a way to swap out that install stub for a 32 bit one (this is a problem even if the program itself is 32 bit but the install stub is 16 bit).
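For the curious, a 16-bit stub can be spotted without running it: both old NE (16-bit) and newer PE (32/64-bit) executables start with a DOS "MZ" header whose e_lfanew field (at byte 0x3C) points at the real signature. A rough shell sketch of that check (the function name is made up for illustration, and the byte-order read assumes a little-endian host like x86):

```shell
#!/bin/sh
# Report whether an .exe uses the old 16-bit NE format or the 32/64-bit
# PE format by following the e_lfanew pointer in the DOS (MZ) header.
exe_kind() {
    f="$1"
    [ "$(head -c 2 "$f")" = "MZ" ] || { echo "not-an-exe"; return; }
    # e_lfanew: 4-byte little-endian offset stored at byte 0x3C (60)
    off=$(od -An -j 60 -N 4 -t u4 "$f" | tr -d ' ')
    sig=$(dd if="$f" bs=1 skip="$off" count=2 2>/dev/null)
    case "$sig" in
        NE) echo "NE (16-bit)" ;;      # won't run on 64-bit Windows
        PE) echo "PE (32/64-bit)" ;;
        *)  echo "unknown" ;;
    esac
}
```

An installer flagged as NE here is exactly the kind whose stub would need swapping for a 32 bit one before it could run on a 64 bit Windows system.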

It's not perfect, and no, nothing will be, but your options for getting it to work are greater compared to closed source programs that are installed in the traditional sense with an installer.

However, having said all of that, my suggestion was not going all out on open source, it was presenting another option in getting those legacy files to work without having to either keep around older software/hardware or VMing (again the false dilemma).
 