
    Thursday, November 4, 2021


    System issues cause a single mother to panic, I look deeper

    Posted: 03 Nov 2021 01:31 PM PDT

    Hey all! You all seemed to like my previous posts, so here's hopefully another heart-warmer for all my techies.
    So for some context: I work for a broadband/TV/phone company in the technical department, but I can also help on the billing side to an extent.

    A lady called in and immediately she didn't seem okay. She said we'd taken over £100 from her bank after she cancelled one of her services with us, and that she was a single mother and no one had informed her we'd be taking that payment. Without that money she wouldn't be able to pay her bills, feed her kids or buy petrol for her car.

    At first glance the account seemed to be right; she'd only had broadband with us for a few months, so of course there would be a fee. But she insisted she'd been with us longer, and she was right! I found emails about a service the account didn't show she had, emails she wouldn't have gotten if she didn't have it! She shouldn't have had any cancellation fees, as there was even an email saying she was no longer in contract! A big system error had caused it, and digging up that kind of information was within my scope.

    I told her that while I could make no promises at this time, I was going to escalate it to my manager and we were going to see if there was anything we could do. She was understandably worried, but I promised her I would call her back. Armed with all the details, I sent everything to my manager. Within an hour we had a response from the higher-ups:

    Give her her money back

    Also, knowing she didn't have much money in her account and that the refund could take a few days, my manager and I agreed we should send her a supermarket voucher, which could go out immediately, so she could feed her kids and keep the money she did have for emergencies until the refund came in. I called her up and told her what we were going to do, and the poor woman was nearly in tears. She thanked me, but I thanked her for calling us and giving me all the information I needed to help her, and told her she'd done the right thing to chase this.

    Considering I had a very difficult call afterwards with another customer where I was sworn at and called 'a waste of space', I'm at least glad I managed to help her today.

    submitted by /u/ASillyFace01

    Compromised - Worst Nightmare to Greatness

    Posted: 03 Nov 2021 02:11 PM PDT

    Some information has been removed or changed to protect identities, etc. Also, this post discusses, at a high level, an attack on our network. I was not the one who launched the attack (I was one of the defenders).

    Within the government organization, I led the internal IT Security Team, spearheading organization-wide changes to our security posture. Aside from the changes I was leading, we also had a contracted MSSP team that monitored the network 24/7. The escalation point for "serious" or "critical" incidents was me. Failing me, it fell to the Senior Sys Admins in order of seniority.

    For the previous couple of weeks we'd seen a few other city governments targeted by state-sponsored attackers based out of Russia. We were briefed by higher levels of government. We constantly hammered at users to complete their training, and we weren't afraid to disable accounts until they did. But users will always remain one of the weakest links in security. Internally we had a sit-down conversation about the "what-ifs". "What if" we are targeted? "What if" our MSSP catches it late? "What if" <whatever>? We had a conversation with our MSSP and re-highlighted the escalation procedures.

    1. Call me.
    2. Failing that - call Sr. Sys Admins.
    3. Failing that - call Management.
    4. Failing that - call Dispatch.
      1. Explain who you are, why you're calling, what you need, and how quickly. Dispatch will handle the rest.

    All was well and good. We went back to the grind of stripping out old hardware, OSes (guests and hypervisors), software, and networking gear. I had a sinking feeling in my stomach though. I knew something was coming, and I had a good idea what it was. I just didn't know when. Now, our network, previously old and mostly insecure, had come a long way. Most of our servers were onboarded for long-term monitoring and had a modern EDR solution. More accurately, all of our servers had been replaced by modern systems. Except one.

    One night (3AM) I get a phone call. It's from our MSSP.

    $MSSP: "Good Evening, it's $Tier1 from $MSSP calling."

    $Me: "Haha, I don't think 'good evening' is a good use of words if you're calling at 3AM! I can appreciate the sentiment though. What's happened?"

    $MSSP: "There are currently 6 accounts in your network actively attacking hundreds of your servers from <IP Range>. Initial indication is they just gained access about 10 minutes ago."

    $Me: "That IP range belongs to our full tunnel SSL VPN which is geo-restricted to our country. Where are these guys logging in from?" (Unfortunately at the time, MFA wasn't being enforced at the VPN. Management was looking to spare a buck. Rest assured, multiple Senior Sys Admins threw books at them later.).

    $MSSP: "$MajorCityInCountry."

    $Me: "We don't have anyone there right now. Send me an email ASAP with their usernames. I'll disable the AD user objects and kick the VPN sessions. Please run a search of the last 14 days for any VPN session that authenticated regardless of location. Group by the columns of Username, Source IP, and geographic location. Also, you mentioned exploits. Is this an attacker throwing shit at the wall and hoping something sticks? Run a second search please for our firewall, last 14 days, any UTM event that wasn't dropped/blocked. Export on Source IP/Port, Dest IP/Port, Username, UTM action and the UTM event itself."

    $MSSP: "We'll export that information and have it over to you in the next 5 minutes. Anything else?"

    $Me: "Check the AV and EDR logs. There are plenty of shells that would make it through the UTM checks at the firewall that may be caught based on behaviour. If they attack FS01, they're going to find something."

    $MSSP: "One just came in - meterpreter shell on FS01 to <C2 IP>. ". (Caught by non-EDR software. It was the last machine on a legacy AV. FS01 = File Server 01. Not it's real name, nor was it exclusively a file server.)

    $Me: "The one server you cannot touch from your console. Please call $SrSysAdmin1. If you can't get ahold of him try #2 and #3. If you can't get ahold of either, call me. Don't call the listed management, they're out of the country at the moment on vacation."

    $MSSP: "Noted. Thanks!"

    So I get logged into AD and kick the 6 users off, reset their passwords, etc. I also bring down the VPN interface entirely, then set my focus on the server that had a shell (detected/prevented) on it. Now, this server was ancient. 'Ancient' actually means ancient: it would be legal age (18+) today if it were still running. If you haven't figured it out yet, a physical box running Windows (Storage Server) 2003. It was a huge liability, but anytime someone, regardless of seniority, even thought about taking a table saw to it, they'd get told off.
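
    For anyone wondering what "disable the AD user objects" looks like when scripted, here's a rough sketch using the ldap3 library - placeholder domain, DNs and account names, not what we actually ran that night:

    # Rough sketch only: disable a handful of compromised AD accounts by
    # setting the ACCOUNTDISABLE bit on userAccountControl. Server,
    # credentials and DN layout below are placeholders.
    from ldap3 import Server, Connection, MODIFY_REPLACE, NTLM

    ACCOUNTDISABLE = 0x0002
    compromised = ["user1", "user2", "user3", "user4", "user5", "user6"]

    server = Server("dc01.example.local")
    conn = Connection(server, user="EXAMPLE\\secadmin", password="********",
                      authentication=NTLM, auto_bind=True)

    for sam in compromised:
        dn = f"CN={sam},OU=Staff,DC=example,DC=local"  # assumed DN layout
        conn.search(dn, "(objectClass=user)", attributes=["userAccountControl"])
        current = int(conn.entries[0].userAccountControl.value)
        # OR in the disable bit so the account can no longer authenticate,
        # without clobbering the other flags already set on the object.
        conn.modify(dn, {"userAccountControl":
                         [(MODIFY_REPLACE, [current | ACCOUNTDISABLE])]})

    Kicking the live VPN sessions and dropping the VPN interface was done on the firewall side, so there's no script for that part.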

    Now, being familiar with how pentests work, I knew they'd probably already dumped SAM and LSASS, making this a huge problem. Just as I cut the server off (something that would normally have gotten someone fired), I get a call back from the MSSP.

    MSSP: "We weren't able to reach any of the Senior Sys Admin team."

    $Me: "Forget the admin team, I'll handle it. FS01 is offline. Run a third search, all admin authentication activity from the last 24 hours. If these guys had any brains they dumped the SAM database and LSASS processes to get credentials."

    $MSSP: "I'll send it over shortly. Just as a heads up, we have a couple Senior Security Analysts online. They're pouring over all your logs to find any other movement. They don't appear to have run any AD enumeration tools."

    $Me: "Awesome. Who are the seniors on your side right now? I'll coordinate how we handle FS01's recovery with them."

    $MSSP: "SrAnalyst1, 2 and 3."

    $Me: "Noted. I need to make some phone calls to Dispatch, time to get chewed out, and wake the seniors up on our side."

    So I call dispatch. At dispatch they have special procedures for special requests... like critical IT staff getting police escorts back to IT HQ.

    $Dispatch: "9-1-1, where's your emergency?"

    $Me: "AV636 calling, priority. I need police dispatched to Sr Sys Admin 1, 2 and 3's houses immediately. Escort them to IT without delay. Unresponsive to phone calls. 6 user accounts compromised, with one critical server offline. Attackers are suspected to be the same group that hit $NearbyCities"

    $Dispatch: "Police will be dispatched. Was the critical asset FS01?"

    $Me: "Yes. Trust me when I say the alternative was much worse."

    $Dispatch: "We need that server, now. Staff are already complaining."

    $Me: "I understand what it's used for, but forget it. If I release the server back to it's normal operations I could very possibly cause the release of a threat internally. We will release it if, and only if, we can reasonably claim it's safe."

    $Dispatch: "Fine. Police will arrive in a minute or two. You will get a call from IT staff once they're awake and in the patrol vehicles. Rest assured they will not be pleased that FS01 is offline."

    $Me: "You're absolutely right - they won't be pleased about it. But they will definitely understand. Thank you for your assistance."

    $Dispatch: "Keep us informed."

    A couple minutes later my phone rings, it's SrSysAdmin1.

    $SSA1: "Hey. What's going on?"

    Me: "Our Russian friends broke the front door down. 6 accounts compromised through the full tunnel VPN. I took the VPN interface offline on the firewalls. FS01 is offline following the detection of meterpreter shells."

    $SSA1: "Fuck! Dispatch is going to be pissed. Have they gotten anything yet?"

    $Me: "No ransomware yet. *knock on wood* But the compromise of FS01 means that they likely dumped SAM and LSASS at the very least. Which means we should be at least resetting all admin passwords, and likely mass-resetting all domain passwords in general. (FS01 was a file server, jump box, and used by Public Safety). It'll be a lot of very pissed off users tomorrow, but it's a sure fire way of making sure they don't have additional credentials. It's also hard to say how many VPN accounts they have."

    $SSA1: "Agreed. What about FS01? Dispatch will want that server online asap."

    $Me: "They've been told to pound sand. This is the kick in the ass that they and management need. Time for FS01 to go. We got LUCKY that it was caught early, and someone got to it in time from our side. I'm damn inclined to say we can never fully trust it and transition to something on newer hardware, OS, software, and protected by the EDR."

    $SSA1: "Playing with fire - I like it. Kick sand they will, management is out, and I'm next senior. I'll play dumb and just 'authorize' it in the frenzy."

    $Me: "SSA2 and SSA3 will be here shortly. Let's throw up something modern, Server 2019, patch in the EDR, get the software interfaces for dispatch/public safety online by the end of the week. Dispatch can run both manual and the new system over the weekend. We won't have any issues getting SOC to stretch an investigation out for the next few days in the interest of security."

    $SSA1: "I'll have that conversation with Dispatch. They're going to lose it when I tell them it's not coming back. Reach out to SOC's Seniors, get a list of every possible reason we can viably use to stretch out the downtime of FS01, just in case management wants a trail. Follow it up with a call to software vendors to get their support."

    $Me: "Sweet. I'll get the PW resets underway using scripts. Then I'll start calling vendors."

    So we get the passwords reset for everyone. Yes, we had hundreds of pissed off users the next day. (Oh well...) SOC gave us a stupid long list of things to check. Naturally, it took 'days' to confirm the server was safe for use. By which time the new server was up and running. Dispatch was furious with us.
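
    If you're curious what "PW resets underway using scripts" roughly means, here's a minimal sketch of the idea with ldap3 - again placeholder server, OU and credentials, not the actual script we used:

    # Sketch of a mass password reset: random temporary password per account,
    # then force a change at next logon. Needs an LDAPS connection for the
    # password operation. All names here are placeholders.
    import secrets
    from ldap3 import Server, Connection, MODIFY_REPLACE, NTLM

    server = Server("dc01.example.local", use_ssl=True)
    conn = Connection(server, user="EXAMPLE\\secadmin", password="********",
                      authentication=NTLM, auto_bind=True)

    conn.search("OU=Staff,DC=example,DC=local",
                "(&(objectCategory=person)(objectClass=user))",
                attributes=["distinguishedName"])

    for entry in conn.entries:
        dn = entry.entry_dn
        temp_password = secrets.token_urlsafe(16)
        # Set a new random password, then expire it (pwdLastSet = 0) so the
        # user is forced to pick their own password at next logon.
        conn.extend.microsoft.modify_password(dn, temp_password)
        conn.modify(dn, {"pwdLastSet": [(MODIFY_REPLACE, [0])]})

    In practice you'd also want to exclude service accounts and stagger the rollout, but that's the gist.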

    IT Management found out the next day what had happened, along with the fact that FS01 would be offline for "extensive review". A few days later we told management that FS01 was being replaced with 2 dedicated servers for dispatch/public safety. Both would run side by side, so we could take one down for standard maintenance without interrupting their service. The file-share side of FS01 for corporate was replaced by a significantly more modern server with faster/expanded storage. The jump box role was moved to a dedicated jump box.

    Sometimes you have to piss off users (or, as they're called, 'customers'). On the other hand:

    1. Proper segregation of Corp and Public Safety. The last of the tied interfaces went offline when FS01 kicked the metaphorical bucket.
    2. VPN had MFA bought and enforced very quickly after that.
      1. Also a full audit of VPN permissions. Anything that had not been active in the last month was removed.
    3. The "purse strings" loosened significantly immediately following the breach.
      1. (That's interesting... Normally when something is prevented, some asshat in C-Level or Council determines "oh, they have enough money!")

    Nightmare: We were probably less than an hour away from losing FS01, followed by most of our environment. Multiple compromised accounts.

    Greatness: It was the kick in the ass people needed to remove it permanently from service.

    ---------------------

    Edit: Thank you everyone for the various awards! I'm not the best at recounting events and telling stories. But it means a lot!

    submitted by /u/Advanced_Vehicle_636

    Sometimes I love my colleagues for teaching me new stuff through their "weird PC errors"

    Posted: 03 Nov 2021 08:32 AM PDT

    Non-IT employee here, but one who knows how to Google and isn't scared to try finding solutions himself instead of calling IT ASAP.

    I am sitting in my office and hear a colleague from the other room scream something like "what's with this shitty PC?!"

    I went over and she showed me her screens. Everything was enlarged. It looked weird. I offered to have a look. First I unplugged her additional monitors and used the laptop's screen, in case something was wrong with the docking station. Nope. Then I decided to check the display settings; my idea was that she'd used a Windows shortcut by mistake. As I moved the mouse I noticed the screen was moving with it, and then I noticed the Magnifier was open and set to 200%. I zoomed out and everything was fine. Apparently she'd pressed the Windows key and the plus key at the same time, which zooms in. I didn't know that shortcut, nor that Win+Esc closes the Magnifier.

    submitted by /u/Tischlampe

    Anyway, I pressed Reset.

    Posted: 04 Nov 2021 01:55 AM PDT

    Found this sub yesterday; glad I can move away from the ComputerStupidities I keep reading and confess some of my own days here.

    The guy has since left, and anyway I don't care what he would say, so:

    Lunchtime, I was at the refectory. That user came by and quietly told me, "Computer froze like five times, I pressed reset every time and now it doesn't boot." I know this guy does not like working too long (like many out there), so if he was telling me at midday, the machine had been down since early morning (or the previous evening).

    No ticket yet, and my lazy ass in no hurry, I went by after 1:30 PM. Sure enough, booting led to a black screen, not even a BIOS screen.

    This machine was fairly recent: 2015, a brand new PCI/ISA backplane (great!), a shiny parallel card (love it!), two front USB ports (awesome!) and **even** a motherboard with SATA support (incredible!) carrying a plate-mounted SSD with its anti-vibration kit (what?) for our trusty Windows XP. Yes, 'we' love dead technologies.

    No way to boot this thing back up, so I took a spare motherboard and started swapping it on site (because this darn thing was really difficult to take out); this is the kind of motherboard with jumpers everywhere, and the original ones were hot-glued. Highly maintainable. Nope, still doesn't work: BSOD on startup (thanks, XP, for rebooting on every BSOD; I had to Google how to stop that).

    *User: So?

    *Me: Well, by resetting it over and over it looks like you finished off the motherboard, and the hard drive too.

    *User: When will this be fixed? Because I'll need to report it at the morning meeting.

    *Me: I think tomorrow morning.

    I went to grab the tools needed to pull the PC out and replace it entirely with a spare, since that was the easiest route before fixing the old one. I found the maintenance coordinator and told him about the "press reset to kill" thing; he was pretty fine with the troubleshooting. I put the PC on the workbench, transferred the cards into the new machine and restored a backup onto the new disk. Looked fine, went back home.

    The next day came. I had to take some pills, so I was pretty out of it; I usually don't really care what people I don't like say, so it was even worse. I put the new machine in place and started plugging everything in (all hail Profibus, eight 9-pin D-sub connectors and many outdated things) when the guy got angry enough to yell at me.

    *User: Hey, you said that was MY fault because I reset multiple times, BUT I told you I reset ONLY ONE time, and your job is repairing, not blaming (insert angry noises).

    A random guy who loves to blame everyone came over to support his point. I was like "yeah, go for it, life's good" anyway. For unknown reasons yet another guy went into corpse-elimination mode; they all quickly calmed down when I started explaining everything I knew. I'm a fun guy, I swear.

    The restored disk didn't work, so I took a look at the BSOD error and, for once, found an answer: wrong BIOS config for reading the disk. I had to change a setting I don't remember, and it turned out the old disk was fine. I'd claimed it was dead anyway; no one would care. It came back to life.

    I reconfigured the address of the parallel port to detect the dongle, yelled at Windows because the old COM ports were hidden so the new config started at COM9 instead of COM2... Everything was working except the network (new MAC address).

    Because I was done with all this, I went directly to the high-level guy to make the emergency request for the network declaration (I don't have access for that), and this **emergency** request took three days and about five calls to get resolved. Finally everything was back up.

    Phew.

    Sorry about the terrible English, btw.

    submitted by /u/Sewef

    Remote access doesn't work. And now it's broken too!

    Posted: 03 Nov 2021 12:42 PM PDT

    Like many of you, I switched to remote work in 2019. Unlike many of you, I had no idea how the heck to set that up... But hey, I was trapped at home with nothing better to do and getting paid to figure it out. It took a few days of research and cursing to cobble a system together, but in the end it worked. Most of the time. In fact it worked well enough that my boss tasked me with setting everyone else up on it too. And that's when the problems started.

    See, it just so happened my workplace and I had the same ISP, so it was very easy for our systems to communicate despite all the NAT involved. For everyone else I used some spare cycles on a VPS I was renting to relay all that traffic instead. Things worked great once I set that up. For everyone except Steve anyway...

    For months, Steve made a point of telling me remote access didn't work for him any time we were both in earshot of our superiors. And every time I had the same response: "Let me have a look at your laptop." But that laptop never came. I'm not going to lie, not only was I totally stumped by what the problem could be, I also stopped caring about it. Winter turned to spring, spring turned to summer and yet Steve's problem remained.

    One day I noticed an unusual charge on my credit card; I had forgotten to cancel the VPS I was renting. Steve and I were the only people who would be using it by that point, but I didn't need it and he kept telling me it didn't work anyway. So with a conscience whose cleanliness can only be the result of a faulty memory I cancelled that service.

    24 hours later Steve tore into my office, frothing at the mouth. "Remote access doesn't work anymore! It's totally broken!"

    "Yeah, yeah, I know... You, me, laptop, et cetera."

    "No, it did work, but now it's totally broken and I can't do my work now!"

    My ears immediately perked up and I did my best to project a halo of innocence round my head once I realized why it'd stopped working. "I'm quite certain you told me it didn't work at all..."

    "Just the one thing! Everything else was fine!"

    "Ohhh?"

    "Please, fix it right away!"

    "Sure thing, Steve. Let me have a look at your laptop."

    submitted by /u/YDAQ

    Try turning it off and on again…

    Posted: 03 Nov 2021 08:07 AM PDT

    So, a while back a buddy of mine had a problem with his school-issued Chromebook: he couldn't sign in to any of the services the admin had linked to his school-issued account. About three weeks after he submitted a tech request, two remote administration sessions where they tried to fix it, and three account resets (our tech team only helps students remotely; they got tired of "why my chromebook no turn on?" / "You need a charger." every ten minutes), I got tired of him complaining about it and started messing with it in person. After an hour of messing with settings, looking through logs and signing in and out, I asked him if he'd turned it off and on again since he'd had the problem. He said yes. A light begins to dawn. "How did you turn it off and on again?" "I closed the lid and then opened it." No. You put it to sleep. Repeatedly. I turn it off and on again and it works perfectly. Big brain moment :p. Still no clue what the problem actually was; pretty sure it just didn't see the credential changes until it actually powered off and on again. /shrug

    Edit: pretty sure this is my most upvoted post : P

    submitted by /u/Nhazmat

    Fun with phone lines

    Posted: 03 Nov 2021 07:09 AM PDT

    So, yesterday's fun: I was in the office taking care of a few things (new switches for a remote office mostly) and I was asked to look at a phone line. Specifically the phone line used by the alarm system.

    The alarm company says the line doesn't work and they can only connect over the LTE backup. This has been an ongoing issue.

    I go in there with my butt set and check - yup there's dial tone, just like every other time I've checked it for them.

    BUT - and this is a big but (but not as big as mine) - it's also on the main fax line. What?? What are we, Mom&Pop Inc. with two lines? We have at least 4 1FLs just sitting unterminated on a BIX block in some random riser room.

    So I go about figuring out how this line gets here (it's in the main electrical room - there's 50,000 volts behind me). I don't make it 18" before I find a splice (it comes out of a multi-pair cable - maybe 6 pair?) then goes into another conduit. Trace the conduit - it comes out of there and is split into an old telecom screw terminal with a thousand other wires, most going nowhere, then out of that to another splice a foot away and into another conduit leading into the Bell room.

    Into the Bell room I go, and find the conduit popping out right behind some massive metal box that the cable company managed to find an inconvenient place for. Trace the cable to another screw terminal where it goes down to an ancient jack on the wall, then back to the demarc via at least two BIX terminations.

    So OK, I put my butt set on the BIX terminal where I REALLY need the line to work (i.e. the thing that actually receives faxes), then listen as I disconnect the spliced-a-million-times cable from the nearest screw terminal. I still have dial tone, so I didn't break the fax. Hooray!!!

    I tried to pull the old cable out of the conduit so I could run a replacement cable, but it's very stuck. Oh well, it can just live there. I found a new path where I can run a simple four-pair cable, terminate it on a properly labelled BIX block, put an unused line on it, then terminate the other end directly at the alarm panel.

    It works, but the alarm is still complaining. At least we know it's their problem now.

    BTW: I think the reason for all the splices is that this is something like our third alarm/access control system. The old ones never seem to get entirely removed; there are several empty boxes on the walls with remnants of cables and abandoned-in-place hardware. So, every time we got a new system, the phone line was just spliced over to the new box.

    I guess next I should pull out all the rest of the dead gear, but who has the time?

    UPDATE: Turns out I did break the fax line. There's now a strong 60Hz hum that's keeping the fax from syncing. I can't hear it over the dial tone, but if you listen in on an actual fax it's definitely there and very loud. Looks like I'm pulling out more cable tomorrow.

    submitted by /u/theservman
