Yeah. I find that IT doesn't really like to implement any technology other than what they absolutely have to, such as providing PCs with Microsoft Office and e-mail. Anything beyond that is out of their comfort zone and they try to avoid it. They act like we're doing them a favor by using their computers, when they wouldn't even have jobs if users didn't need PCs for actual productive work.
I especially like your last sentence, "He ended up having to create a new account that he *claimed* was more secure and then we put that account on the systems." I've emphasized the most interesting/telling word in there.
I smiled when reading Hooovahh's post, then frowned when reading crossrulz'. My group has this argument all the time, and I maintain that data that isn't available to other functional groups is next-to-useless. It often means that I'm the gatekeeper and everybody is justified in asking me to answer whatever crazy (or reasonable) question they have. Or, worse, it means that the data isn't aggregated/tracked/examined once the acceptance test is passed.
I've had people suggest building an "engineering" network that we control. Good grief, Charlie Brown!
What solutions have you implemented that could get me down off of the ledge?
I'm reminded of the customer who would grin at me, after I acted as the intermediary between him, his projects, and IT, and say "That is why I pay you to do it, Ben!"
For what it's worth, take names and keep phone numbers as you navigate the IT sea. Even if the "person who knows what they are doing" retires, invoking their name may help you get through to someone who can actually help.
A couple of months after starting at a company, I was asked to estimate how long a project would take. At this place we needed an administrator login to install any software or drivers. It was kind of annoying because I was working with a lot of new hardware, and each time I needed to install new drivers I'd have to hunt down the IT guy. This could delay me quite a bit if he had gone home for the day or was busy in meetings. So for a project to add 2 more instruments into an application, I might estimate 2 weeks, or 1 week if I had an administrator account. My account was made a local administrator on my computer and I could install drivers to my heart's content 🙂
Oh, that reminds me of another one that I don't remember if I've shared. I was working at a place that had super locked-down security: a hard-drive lock password, a Windows password, two-factor authentication with an RSA key, and a few other steps for the VPN, but at least all of these things worked. Pretty early on I needed to install the NI suite of software, but the account I had was locked down. The process to install anything was to request a temporary administrator account, which would be created after approval. Then you'd get an account that worked for only 1 hour, and if you didn't install everything in that hour you'd have to request an administrator account all over again. Several of the people I worked with just said to turn your local account into an administrator within that 1-hour window, and then your account could install anything you wanted.
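(For anyone wondering how that trick works mechanically: on Windows it's basically a one-liner run from an elevated prompt while the temporary admin account is active. Treat this as a sketch, not gospel, since the local group name varies by locale and domain policy may block it entirely:)

```
rem Run from an elevated prompt during the 1-hour temp-admin window.
rem "Administrators" is the default local group name on English Windows.
net localgroup Administrators %USERNAME% /add
```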
Interesting. Our company has instituted this "superadmin" policy where you need the superadmin password, which gets reset automatically every week. I think I'd be able to turn my regular account back into an admin account with actual rights to install software, but I don't know how long that would last, as they also have a GPO that periodically audits PCs and resets things like that back to the original useless "won't let you do your job" access rights.
Speaking of LabVIEW and security: I worked for a company that had an overzealous set of antivirus settings. I was working on a webcam emulator to serve a simple, testable image to a device under test. I went to compile, then couldn't find the executable. After several attempts, and thinking I was going crazy, I finally figured out that the antivirus was quarantining the software immediately. In hindsight, it's probably because it was designed to read a file and send UDP packets. Understandable, but annoying to get around.
It was an interesting project because many of the utilities and tools I had to use involved low-level networking. I immediately had to buy an off-network, non-IT machine. To this day, I cringe at the thought of corporate security touching my test sets (between this and virus scans and forced updates messing up critical timing).
Even worse, we recently had a test system do a forced reboot (after an IT security update) in the middle of a test run. Needless to say, that system was built before I started here.
The way we do testing around here has changed in the last few years to avoid some of these issues. We used to have Windows test machines running the test and doing report generation, acting as more or less the single point of control for the other test equipment. Now we've moved to an embedded cDAQ controller running Linux RT for the test execution, and the Windows PC is just there for status and report generation.
With the advent of Windows 10 and forced reboots (at least on non-domain-controlled PCs), and various other IT interventions, we found this to be the most stable way to do things. So here we are using Linux RT for the reliability, and less for the deterministic capabilities. If the PC is restarted for whatever reason, it logs into the local user and on startup runs our software, which automatically connects to the remote controller. I come in and the tester looks just like I left it, even though it might have gone through reboots. It certainly increases the cost of things, but management here is fine with that as long as IT and headquarters policies don't interrupt testing.
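For what it's worth, the "reconnect on startup" part is simple in principle. Here's a rough sketch in Python (not our actual LabVIEW code; the port, protocol, and names are all made up for illustration) where the Windows-side status client keeps retrying until the controller answers, so a forced reboot just means a short reconnect instead of a dead tester. A local fake "controller" thread stands in for the cDAQ:

```python
# Sketch of a status client that reconnects to a remote test controller.
# The fake_controller thread is a stand-in for the cDAQ running Linux RT.
import socket
import threading
import time

def connect_with_retry(host, port, attempts=10, delay=0.2):
    """Try to open a TCP connection, retrying until the controller answers."""
    for _ in range(attempts):
        try:
            return socket.create_connection((host, port), timeout=1.0)
        except OSError:
            time.sleep(delay)  # controller not reachable yet; try again
    raise ConnectionError("controller never became reachable")

def fake_controller(port, ready):
    """Stand-in for the real controller: accept one client, send a status line."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    ready.set()                     # signal that we're listening
    conn, _ = srv.accept()
    conn.sendall(b"STATUS: running step 42\n")  # hypothetical status message
    conn.close()
    srv.close()

if __name__ == "__main__":
    ready = threading.Event()
    t = threading.Thread(target=fake_controller, args=(50555, ready))
    t.start()
    ready.wait()
    sock = connect_with_retry("127.0.0.1", 50555)
    print(sock.recv(1024).decode().strip())  # prints the fake status line
    sock.close()
    t.join()
```

In the real setup the client would loop forever, re-running `connect_with_retry` whenever the link drops, so the PC side is purely a viewer and the test itself never depends on Windows staying up.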