Agreed. I believe gov't capabilities are overrated. The guy in the video above is using cell phone camera sensors, for chrissakes. I know the crap computers I worked on in the Air Force were way behind the private sector, and they were used for highly sensitive intelligence for four-star generals.
The corporate labs have very highly paid engineers working on tech the government can't hope to match. Just my $0.02 :)
But how fast does it do 0-60?
Quote:
this can see 15 square miles. So it would take 252,934 of these to guarantee they are watching me, and like pwd72s, I am one of the most boring people on earth. BUT. If I get them to focus it just right. Maybe, just maybe. I could find the dang apex to the corner.
That's a lot of pixels yo
I should have said I won't tell you how I know.
There are some things wrong with this explanation of this tech. Let's just do some napkin math here.
1 million terabytes a day. 24 hours in a day makes that about 41,666 terabytes an hour, or roughly 11.5 terabytes A SECOND. 11.5 TB/s. I call BS. Here is an article on a publicized record for fast data transmission: http://allthingsd.com/20120516/new-terahertz-wireless-connection-faster-than-your-microwave-oven/ That is 3 gigabits per second, or 0.375 GB/s, which is about 30,000x slower than what this video stream would need. OK, so maybe they have MIMO-style multiplexing of the signal, multiple transceivers running in parallel. Still... 30,000+ of them? I'm not seeing how they are streaming this live at all. Nor storing it, even compressed.
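Same napkin math in a few lines, taking the "1 million terabytes a day" figure at face value and comparing it against the 3 Gbit/s link from that article:

```python
# Napkin math for the claimed ARGUS output, taking the
# "1 million terabytes a day" figure at face value.

TB_PER_DAY = 1_000_000                 # claimed raw output, TB/day
SECONDS_PER_DAY = 24 * 3600

required_tb_per_s = TB_PER_DAY / SECONDS_PER_DAY       # ~11.57 TB/s

link_gbit_per_s = 3                                     # record terahertz link from the article
link_tb_per_s = link_gbit_per_s / 8 / 1000              # 0.000375 TB/s (0.375 GB/s)

print(f"Required:  {required_tb_per_s:.2f} TB/s")
print(f"Best link: {link_tb_per_s:.6f} TB/s")
print(f"Shortfall: {required_tb_per_s / link_tb_per_s:,.0f}x")   # ~30,000x
```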
The sensor is damned real though. I'm just thinking that they took the thing up in a plane and recorded 30-45 seconds of video with it, and that is all they can store and process, non-real-time that is.
They didn't have time and money to come up with new CCD tech, so they used commercial tech: iPhone CCDs. They wouldn't have come up with new data transmission, storage, and parallel processing just for this either.
They could use the drone to track the guy taking up two parking spaces.......
One thing for sure, that is not a threat in Portland. Clouds will stop that surveillance.
With our aerial mapping camera, if we go to 17,500 feet AGL we will cover about 25 square miles in one photo. We end up with a 1.1 GB image. It is 19,700 pixels square, so it is a 388-megapixel still image, with about a 1.5-foot pixel. That is with a mapping camera that cost more than a couple of new 911 Turbos. At that resolution a human disappears and a car is just a blob. To get the half-foot pixel resolution they claim in the video we have to drop to 4,000 feet or so, and that only covers about a square mile. And like Schumi mentioned, I can't see how they transfer that much data. With just an off-the-shelf $100 external USB 3 drive it takes us many HOURS to transfer just one TB in the real world, and they are doing 100 times that. I would love to see the specs on what system can stream 100 TB in a few hours or even 12 hours. And how do you store 3,000 TB per month? That would cost a fortune.
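Quick sketch of that scaling, assuming the ground resolution scales linearly with altitude for the same camera; the pixel count, altitude, and coverage figures are the ones from the post above:

```python
import math

# Altitude vs. coverage scaling for a fixed mapping camera, assuming
# ground sample distance (ft/pixel) scales linearly with altitude.

PIXELS_ACROSS = 19_700          # frame is 19,700 pixels square (~388 MP)
REF_ALTITUDE_FT = 17_500        # reference altitude AGL
REF_COVERAGE_SQ_MI = 25         # stated footprint at that altitude

ref_side_ft = math.sqrt(REF_COVERAGE_SQ_MI) * 5280
ref_gsd_ft = ref_side_ft / PIXELS_ACROSS    # ~1.3 ft/pixel, i.e. the "1.5 foot pixel"

def at_altitude(alt_ft):
    """Return (ft per pixel, square miles per frame) at a given altitude."""
    gsd = ref_gsd_ft * alt_ft / REF_ALTITUDE_FT
    side_mi = gsd * PIXELS_ACROSS / 5280
    return gsd, side_mi ** 2

for alt in (17_500, 4_000):
    gsd, area = at_altitude(alt)
    print(f"{alt:>6} ft AGL: {gsd:.2f} ft/pixel, ~{area:.1f} sq mi per frame")
```

Dropping to 4,000 feet gets the pixel down near the half-foot mark but shrinks the footprint to roughly a square mile, which is the trade-off described above.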
On a side note, that big touch screen the guy is using? My buddy down the street made it. Cool little company that just got bought by Microsoft. I expect that touch screen technology will explode in coming years.
End of side note. Larry
Schumi, that's just what I was thinking: no way they could transmit that wirelessly, but the storage is doable, for about a week :)
Why did they come up with the acronym or name it Argus? I always associated the Argus camera with a POS 126 Instamatic.
Cost of storage? It's the government. They have lots of money for storage and everything. They have all your money. :)
For those who have not read 1984, there is more to it than you may have heard. Orwell was a genius.
I would like to see some of the images from over 50 years ago thanks to the U-2. Supposedly much of it is declassified, but the CIA has made it difficult or blocked the film index references.
So the gov. can zoom in on you while you're bonking the wife or g.f. outside in your private backyard, and the hacker who opens your webcam gets you inside. Must be lots of gigglebytes stored at the NSA for some of you studs. LOL
Yup, except for one thing... what is stored is what's different, not what's the same, once the baseline is determined. If a stationary image doesn't change, the AI goes back and codes up that image; the only things saved are the changes, and the non-change is saved as a template of sorts until it does change. It adjusts itself and purges every X amount of time. Just say'n... ;)
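A minimal sketch of that change-only idea, assuming plain per-pixel differencing against a baseline frame; the threshold and names here are made up for illustration, not anything from the actual system:

```python
import numpy as np

# Store only the pixels that changed relative to a baseline "template" frame.
# Frames are 8-bit grayscale NumPy arrays; the threshold is arbitrary.

def changed_pixels(baseline, frame, threshold=10):
    """Return (row, col, new_value) for pixels that moved past the threshold."""
    diff = np.abs(frame.astype(np.int16) - baseline.astype(np.int16))
    ys, xs = np.nonzero(diff > threshold)
    return list(zip(ys.tolist(), xs.tolist(), frame[ys, xs].tolist()))

baseline = np.zeros((4, 4), dtype=np.uint8)   # the stored template
frame = baseline.copy()
frame[1, 2] = 200                             # one pixel changes (something moved)

print(changed_pixels(baseline, frame))        # [(1, 2, 200)] -- only the change is kept
```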
Schumi, as per the video, their solution on the camera side is 368 phone cameras.
It would then follow that their choice of parallel architecture on the camera side is likely replicated in some manner on the back side. I say in some manner because if you consider the data throughput of each camera alone, whatever is doing the processing can easily handle more than one camera at a time. If you do all the numbers, the figures being kicked around are clearly based on zero compression. With compression you can reduce the amount of data significantly, but as you say it is still a daunting task. How might this be done? Well, as you point out, radio clearly doesn't have the muscle to run all out at even 15 frames a second. However, transfer rates using light are much, much higher and could keep up. If I had to guess, what is actually transferred is probably dynamic: very high resolution single-shot capability all the way down to lower-resolution real-time video. Unfortunately, the BD folks in these defense contractors are notorious for simply pulling crap out of thin air when it comes to specs, capabilities, and promises.
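For a feel of the per-camera split, here's some napkin math assuming 368 five-megapixel sensors as stated in the video; the frame rate, bit depth, and compression ratio are assumptions for illustration, not published specs:

```python
# Per-camera vs. aggregate throughput for a mosaic of phone sensors.
# Camera count and sensor size come from the video; frame rate, bit depth,
# and compression ratio are assumed values, not published specs.

NUM_CAMERAS = 368
PIXELS_PER_CAMERA = 5_000_000      # 5 MP each
FRAMES_PER_SECOND = 12             # assumed
BYTES_PER_PIXEL = 1                # assumed 8-bit mono
COMPRESSION_RATIO = 50             # assumed video compression

raw_per_camera = PIXELS_PER_CAMERA * FRAMES_PER_SECOND * BYTES_PER_PIXEL
raw_total = raw_per_camera * NUM_CAMERAS

print(f"Raw per camera: {raw_per_camera / 1e6:.0f} MB/s")
print(f"Raw aggregate:  {raw_total / 1e9:.1f} GB/s (~{raw_total * 86400 / 1e12:.0f} TB/day)")
print(f"With {COMPRESSION_RATIO}:1 compression: {raw_total / COMPRESSION_RATIO / 1e9:.2f} GB/s")
```

Each sensor on its own is only tens of MB/s, which is easy to handle in parallel; it's the aggregate, and getting it off the aircraft, that is the hard part.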