"I'm not really a computer person though! That's your job!" Tech Support |
- "I'm not really a computer person though! That's your job!"
- Let's Make A Plan and Start Calling For Buy-in
- Just to be on the safe side ...
- Point of sale problems
"I'm not really a computer person though! That's your job!" Posted: 29 Jul 2021 03:55 PM PDT This just happened. Client called. Can't log into computer. I try to remote in. Says computer's disconnected. I tell the client and ask them to restart. They ask what a restart is. I pause for a second, thinking they misunderstood.
Alright, we'll do it the unpleasant way.
Eventually we gave up, and they called their manager, who had already left for the day, to come back in and help them find the power button.
Let's Make A Plan and Start Calling For Buy-in
Posted: 29 Jul 2021 11:11 AM PDT

To give some background... A year ago my leader mentioned that he would like to transition off an inherited SSO implementation to Azure SSO in the next three years. FFWD to 5/3/21, three months ago: our SSO account rep reaches out and reminds us of the upcoming renewal due 7/30/21. With COVID we've been cutting back on annual expenses and want to save the money, so my leader said cutover by that date. My self-defined success criteria: 72 configured applications across 25 different systems with 21 different user groups, plus 90%-or-higher company-wide Microsoft MFA enrollment. With how slow healthcare IT is for changes... impossible, right? Fuck it, let's make a plan and start calling for buy-in.

I draft up a simple cutover plan, listing which apps on what dates, grouped by tier, user group, and expected complexity. I also write down who I need on my A-Team, and who I need buy-in from. So I start calling up my fellow system admins. Conversations went like this: "My leader wants to switch to Microsoft SSO by 6/30, I need you on board. I'll handle all change control, cutover config, communication, and support; I just need your buy-in." "That's a pretty aggressive timeline, but if you can do it, let's do it." Cool, I've got the sysadmins on board.

Time to call second-tier business owners: HR, Pharma/Healthcare admins, and such. Conversations go like: "I've done some math; our SSO system costs the company 500k a year in employee login time, renewal, server time, and support. We are switching SSO providers and are not renewing. If you'd like your application login to still function, I need you to agree to X timeline." "What do I need to do?" "I'll handle it all, I just need your written buy-in." Cool, I've got the second-tier owners on board. The last calls of that day were to my A-team: the service desk manager, the architect, and the IT marketing guy. All on board.

The next day, for completeness, I set up a call with a VAR to see if they could handle the project. They sold us the original SSO product and tout their M365 expertise all the time. The conversation was sub-par, and the quote for the work was two years of the renewal cost. Yeah, if the project didn't complete we could "blame them", but that is a mindset for the weak. This is the point where I knew we had to handle it all in-house.

Over the next weeks it was documentation, CCB, implementation, and scripting. As with all healthcare systems, we have many regulations and bureaucratic hoops to deal with. Authentication is a major part of non-repudiation, so I had to comb through all our audited procedures and get them updated. Thankfully I was able to delegate some of that work. Change control went smoothly, except for the first one: the manager of IT Project Managers piped up when I was explaining the breadth of this project, its effects, and its timeline. Long story short, they didn't like that I didn't request a PM (who would have had no idea what to do), and that we didn't have a vendor to fall back on "to blame".

Implementation was all smooth, even for applications used by every employee. I tell you what... the technical gymnastics I pulled off to make a smooth HR cutover were a proud moment. The hiccup came when we got to the MFA-required applications: remote access. This whole time, my A-team had been communicating to end users about the cutover and the requirements around MFA, handling any and all issues, and providing exceptional support.
Week 1: jump from 1% to 40% enrollment. Weeks 2-5: up to 50%. We needed 90% MFA enrollment by week 7. I had built a script to help visualize enrollment and create a distro of those not enrolled, so day after day we blast those who haven't enrolled, communicating the procedure and the deadline. Week 7: we are at 75%. Thank god week 8 wasn't the actual deadline! I've still got 4 weeks to get this done.

Week 8 - These people just don't want to enroll; pushed my timeline back for the remote access apps. Time to alert their managers. I modify my script to pull their manager too, and write a new script to blast an email to the managers detailing the percentage of their team enrolled and naming those who haven't. 85% enrollment.

Week 9 - Working on the last 5%, and I start 2 weeks of PTO next week (why I wanted an 8-week implementation). Blast another email to the managers warning them of potential work stoppage, and to the affected people specifically. Managers were obviously not happy with their employees. Jumped to 96% enrollment. Time to cut over.

Week 10 - I'm going on PTO, y'all. 'Bout time to document and have our first official meeting. We set responsibilities and walk through what is left, who to talk to, and how to implement; passed mostly to my architect. As expected, implementation hiccups.

Week 11 - Fixing hiccups and standing up some internal architecture for Citrix.

Week 12 - Full remote access SSO+MFA implementation. The 4% not enrolled are mad, but they had months of warning.

Week 13 - Everything is done, and we the sysadmins are happy. HR was surprised it went so smoothly, and our CIO and CISO are over the moon about the SSO implementation.

The only thing I will change in the future is more delegation. Hopefully by then the skills on my team will be up to par to handle the aggressive timelines we are consistently expected to achieve. Working in IT Security is fun.
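The post doesn't show what the enrollment-report or manager-escalation scripts actually looked like (they were presumably written against Azure AD). Purely as a minimal sketch of the idea, here is one way to do the "email each manager their team's enrollment percentage and name the stragglers" step in Python, assuming the enrollment status has already been exported to a CSV with columns user, manager_email, and mfa_enrolled; the file name, column names, SMTP host, and addresses are all placeholders, not details from the original.

```python
# Sketch: chase MFA stragglers by emailing each manager their team's enrollment
# percentage and the names of those not yet enrolled. Input CSV format, SMTP host,
# and addresses are assumptions for illustration, not from the original post.
import csv
import smtplib
from collections import defaultdict
from email.message import EmailMessage

SMTP_HOST = "smtp.example.internal"          # placeholder mail relay
FROM_ADDR = "it-security@example.internal"   # placeholder sender

def load_enrollment(path):
    """Group users by manager and track who still hasn't enrolled in MFA."""
    teams = defaultdict(lambda: {"total": 0, "missing": []})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            team = teams[row["manager_email"]]
            team["total"] += 1
            if row["mfa_enrolled"].strip().lower() != "true":
                team["missing"].append(row["user"])
    return teams

def notify_managers(teams):
    """Send each manager their team's enrollment percentage and the stragglers by name."""
    with smtplib.SMTP(SMTP_HOST) as smtp:
        for manager, team in teams.items():
            if not team["missing"]:
                continue  # nothing to chase for this team
            enrolled = team["total"] - len(team["missing"])
            pct = 100 * enrolled / team["total"]
            msg = EmailMessage()
            msg["Subject"] = f"MFA enrollment: {pct:.0f}% of your team is enrolled"
            msg["From"] = FROM_ADDR
            msg["To"] = manager
            msg.set_content(
                "The following team members have not enrolled in MFA and may lose "
                "remote access at cutover:\n\n" + "\n".join(team["missing"])
            )
            smtp.send_message(msg)

if __name__ == "__main__":
    notify_managers(load_enrollment("mfa_enrollment.csv"))
```

Run daily against a fresh export and the same report doubles as the "distro of those not enrolled" for the end-user nag emails.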
Just to be on the safe side ...
Posted: 29 Jul 2021 02:23 PM PDT

Back when RAM was much more expensive than it is now, an application administrator was dealing with a performance issue. After investigation, he concluded that the problem was a memory shortage on the VM hosting the application. He estimated what the application would need, adding a little bit on top to be on the safe side. He then talked to the team lead, saying we needed this much extra RAM. The TL then talked to the project manager and asked for extra RAM, adding a little bit to the application admin's figure - just to be on the safe side. The PM then talked to the OS support guy (me) and asked for extra RAM, adding a little bit to the TL's figure - just to be on the safe side. I added some RAM to the VM - with a bit extra over the PM's figure, just to be on the safe side.

A while later, the application admin and I were comparing notes and discovered that what he'd been given was way more than he'd asked for - and realised every step in the chain had added a bit extra. I'm a bit fuzzy on the details now - it's been a while. But I think the numbers were something like: the application needed 6GB, the VM got 11GB.
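The compounding is easy to reproduce. A rough illustration, assuming about 16% padding at each of the four hand-offs (the story doesn't give the real per-step figures):

```python
# Rough illustration of compounding "safe side" padding. The ~16% per-step margin
# is an assumed value chosen to show how 6 GB can grow to roughly 11 GB across
# four hand-offs; the story doesn't state the actual figures.
need_gb = 6.0
padding = 0.16

request = need_gb
for step in ("app admin", "team lead", "project manager", "OS admin"):
    request *= 1 + padding
    print(f"{step} passes along a request for {request:.1f} GB")
# ends around 10.9 GB for an application that needed 6 GB
```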
Point of sale problems
Posted: 29 Jul 2021 08:08 AM PDT

Got a call one morning from a store manager complaining that the store was supposed to open in about 5 minutes and none of the registers would open. I asked that he go to the computer room and tell me what lights he saw on the computer racks that contain the computers, networking equipment, and UPSs. He tells me that there aren't any lights; even the room was dark. I suggested that he turn the room lights on and then verify that the computer racks were plugged in and that their dedicated circuit breakers were on. The breakers are on, but the racks are unplugged. There is an industrial coffee maker plugged into the computer outlets. The stockers that come in at midnight would unplug the computers and then plug in the coffee machine (you don't need the computers to stock shelves or clean floors), then unplug the coffee machine and plug the computer racks back in when they leave. That morning, they forgot.

Had the store manager unplug the coffee machine and plug the racks back in. In about a minute the network equipment started coming up, and a minute or two later the computers started. That enabled the registers to open so they could serve the patrons. Told the store manager to get an electrician out there to put in another circuit (not on the backup generator) just for the coffee machine. No more problems.

There have been some interesting questions posted below. I will try to answer some of them here. This was in the days before electronic shelf tags, so the night stockers had to have access to the room to print and get shelf tags. The store manager did not get the electrician out there, but I did generate the work order to have it done, so no more midnight outages. The electrician also moved the coffee maker out into the hall so it was not next to the computer and network equipment.