Pelican Parts Forums

Neilk 10-15-2018 05:06 PM

AT&T sucks. Data center fire, no Service for 48 Hours!?!?!?
 
Local AT&T data center had a fire, and rumor has it that it may take 48 hours to get Dallas-area customers back online? That's crazy. What kind of redundancy or disaster recovery plan did they have in place? Apparently a lousy one.

Link to article.

Spectrum sounds better and better, but I have heard bad things about them too. Good thing we can tether our phones, but with everyone doing the same thing, it’s been rather slow.

P.S. Definitely a first-world problem, and I'm thankful for what I have.

KFC911 10-15-2018 05:26 PM

Could be incompetence, bad execution, or by design (too expensive to do it "right"...so just roll the dice) :(....

masraum 10-15-2018 06:27 PM

It's a balance of cost vs risk. The cost to do it right was too high, so it was done right enough for most things but not a fire.

And it could have been lots of things. Maybe there was network redundancy, but inadequate power redundancy or inadequate cooling redundancy.

Either way, most residential customers aren't paying enough for them to be worried about a 2 day outage. If you want SLAs, you'll likely pay WAY more for your service and will get better service. But even then, there can be long outages. If you want as close to 100% uptime as possible, you'll have to get multiple circuits from multiple providers and have them come into your house on opposite sides of the house, and get maps of how the circuits get to your house to ensure that they don't meet up 5 miles from your house in the same fiber bundle that will eventually get hit by a truck or a backhoe or a "fire in the man hole" (I hear that one a lot, it always makes me chuckle and think of Mexican or Indian food).
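
To put rough numbers on that, here's a minimal sketch of the availability math behind ordering diverse circuits (Python, with made-up availability figures; it assumes the two circuits fail independently, which is exactly the assumption a shared conduit or fiber bundle breaks):

```python
# Rough availability math for redundant circuits, assuming the two
# circuits fail independently. Figures are illustrative, not actual
# carrier numbers.

HOURS_PER_YEAR = 24 * 365

def downtime_hours(availability: float) -> float:
    """Expected downtime per year for a given availability."""
    return (1.0 - availability) * HOURS_PER_YEAR

single_circuit = 0.995                        # ~43.8 hours of downtime a year
redundant = 1 - (1 - single_circuit) ** 2     # outage only if both are down at once

print(f"one circuit : {downtime_hours(single_circuit):6.1f} h/yr")
print(f"two circuits: {downtime_hours(redundant):6.2f} h/yr")
```

The second circuit only buys you that improvement if the paths really are separate all the way back, which is why the maps matter.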

stomachmonkey 10-15-2018 06:36 PM

Quote:

Originally Posted by masraum (Post 10216925)
It's a balance of cost vs risk. The cost to do it right was too high, so it was done right enough for most things but not a fire.

And it could have been lots of things. Maybe there was network redundancy, but inadequate power redundancy or inadequate cooling redundancy.

Either way, most residential customers aren't paying enough for them to be worried about a 2 day outage. If you want SLAs, you'll likely pay WAY more for your service and will get better service. But even then, there can be long outages. If you want as close to 100% uptime as possible, you'll have to get multiple circuits from multiple providers and have them come into your house on opposite sides of the house, and get maps of how the circuits get to your house to ensure that they don't meet up 5 miles from your house in the same fiber bundle that will eventually get hit by a truck or a backhoe or a "fire in the man hole" (I hear that one a lot, it always makes me chuckle and think of Mexican or Indian food).

Pretty much.

A 100% SLA would be cost-prohibitive to offer to consumers.

A Spectrum rep offered me a 100% SLA for my office, at $1,500.00 per month.

I think I laughed so long he just hung up.
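
For a rough sense of what those tiers promise, here's a quick back-of-the-envelope conversion from uptime percentage to allowed downtime per month (illustrative only; real carrier SLAs spell out credits and exclusions rather than guaranteeing literal uptime):

```python
# Convert common SLA uptime tiers into allowed downtime per 30-day month.
# Purely illustrative; a real SLA defines credits and exclusions rather
# than guaranteeing literal 100% uptime.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes

for uptime in (0.99, 0.999, 0.9999, 1.0):
    allowed_min = (1.0 - uptime) * MINUTES_PER_MONTH
    print(f"{uptime * 100:g}% uptime -> {allowed_min:7.1f} min of downtime/month")
```

Residential service typically carries no SLA at all, which is part of why a 100% guarantee prices out at business rates.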

osidak 10-15-2018 10:42 PM

I have Spectrum at home. Internet in my neighborhood went out around 9 PM Friday night and came back around 7 PM Sunday.

DanielDudley 10-16-2018 01:05 AM

Must have been a heck of a fire.

KFC911 10-16-2018 02:39 AM

Network redundancy and data center reliability are two completely different animals, IMO. They are being crossbred in this thread...

stomachmonkey 10-16-2018 03:56 AM

Seems they are back up in under 12 hours.

Not terrible.

stomachmonkey 10-16-2018 03:58 AM

Quote:

Originally Posted by KFC911 (Post 10217124)
Network redundancy and data center reliability are two completely different animals, IMO. They are being crossbred in this thread...

Two parts of the whole, so it's perfectly fine to conflate them.

Neilk 10-16-2018 07:52 AM

I think the lack of communication is what peeved me. If AT&T had said it typically takes 12 hours to get back online, I would have been fine with the issue, but the open-ended timeline, lack of details, and rumors it could take days were the problem.

KFC911 10-16-2018 09:24 AM

Quote:

Originally Posted by stomachmonkey (Post 10217168)
Two parts of the whole so perfectly fine to conflate them.

Respectfully disagree....redundancy to the periphery of a network is expensive and rare imo. If you paid for this level of service, and were out of commission due to a data center outage, then what? A 12 hr. DC outage (the ea$ier part imo) is freakin' huge. My former employers literally spent millions each year on DR, testing, etc. Two did it for "real" (megabanks), another International corp did it for insurance purposes only...."rolling the dice"...an expensive farce :(.

I haven't read about this particular outage though. IMO it just depends on which part out of many parts was the "cause", and whether it failed to work as designed (or failed as designed due to cost).

I don't think about this crap anymore though...so I'm done here :)

stomachmonkey 10-16-2018 10:50 AM

Quote:

Originally Posted by KFC911 (Post 10217519)
Respectfully disagree....redundancy to the periphery of a network is expensive and rare imo. If you paid for this level of service, and were out of commission due to a data center outage, then what? A 12 hr. DC outage (the ea$ier part imo) is freakin' huge. My former employers literally spent millions each year on DR, testing, etc. Two did it for "real" (megabanks), another International corp did it for insurance purposes only...."rolling the dice"...an expensive farce :(.

I haven't read about this particular outage though. IMO it just depends on which part out of many parts was the "cause", and whether it failed to work as designed (or failed as designed due to cost).

I don't think about this crap anymore though...so I'm done here :)

Well it depends on your frame of reference.

From your side it matters; from the consumer's side (the OP), they couldn't care less. All they know is that they're paying for something they're not getting, and the why is inconsequential.

It was a lightning strike that caused a fire in a data center.

MBAtarga 10-16-2018 10:50 AM

Hartsfield-Jackson Atlanta International Airport lost power completely one afternoon within the last year. There was a fire in one of the tunnels where power was fed, which knocked out the service. The interesting thing was that the primary and secondary power cables ran through the same tunnel, within INCHES of each other. Of course the fire damaged both sets. It had been deemed too expensive to route the sources via different paths.

KFC911 10-16-2018 12:43 PM

I have personally experienced the "backhoe" gotcha that Steve mentioned earlier. Back in the early days of fiber...on paper it looked great...until a backhoe cut both fat trunks (different carriers, SUPPOSEDLY totally different paths through the city too...but nope :(). Money was no object...but it happened. We limped along on copper and other data centers...it took days to fix fiber back then...others learned from us...not in a good way though :)

stealthn 10-16-2018 05:09 PM

Should look at The Bunker in Woodlands, an ex-nuclear shelter. Great guys and good pricing.

