We have all seen the movies where a spy is this highly trained operative in a tuxedo, or maybe someone wearing a trench coat in a dark alleyway handing over a microchip. It is a classic image, but it is also completely outdated. Today, the person helping an adversary might just be a guy with a smartphone who wants to make fifty dollars in cryptocurrency for taking a photo of a hole in the ground. It is mundane, it is transactional, and it is happening right now on a scale we have never seen before.
I am Herman Poppleberry, and that shift from the professional operative to what people are calling the gig economy spy is one of the most significant changes in intelligence gathering we have seen in decades. It is decentralized, it is low cost, and it is incredibly difficult to stop because it hides in plain sight. We are moving away from the era of the sleeper agent and into the era of the human sensor.
Today's prompt from Daniel is about this exact evolution. He wants us to look at how the Islamic Revolutionary Guard Corps, or the IRGC, is leveraging platforms like Telegram and using crypto payments to turn ordinary civilians into unwitting intelligence assets. It is a trend that started in Israel but is now spreading across the Gulf, affecting countries that previously thought they were insulated from this kind of infiltration.
The timing on this is critical. Just a few weeks ago, between March third and March fifth, twenty twenty-six, we saw Qatar announce the arrest of ten suspects in two different cells linked to the IRGC. This was a massive story because seven of them were focused specifically on espionage against military and vital facilities, while three others were being trained for sabotage, including drone operations. This is Qatar, a country that usually has a fairly functional relationship with Iran as a key interlocutor, yet even they are seeing this kind of deep infiltration.
It really highlights that this is not just about one specific conflict or one border. It is a new playbook for regional intelligence. Daniel mentioned that there is a lot of information to be gained from people who just happen to be in the right place at the right time. We are talking about battle damage assessment, or BDA, which is something we have touched on before, but the human element here changes the math entirely. When you can crowdsource your eyes on the ground, you do not need a satellite constellation.
The technical challenge of battle damage assessment is that satellites, while amazing, have limitations. You might get a high-resolution image from a satellite showing a strike site, but that image is a snapshot in time, often delayed by orbital mechanics or degraded by cloud cover. It might not tell you whether the missile hit the actual target or just a decoy. It might not show you the specific debris from an interceptor, which could tell an adversary which version of the Iron Dome or David's Sling interceptor was used.
And that is where the human sensor comes in. If you have someone on the ground who can walk up to a crater and take a high-definition video from three different angles, that is worth more than a dozen satellite passes. You can see the angle of entry, the type of fragmentation, and even the reaction of the local emergency services. You can see if the "hit" on the satellite image was actually a fire started by a fallen interceptor rather than the warhead itself.
There is a massive psychological barrier that this model bypasses. If an intelligence officer approaches you in a bar and asks you to steal classified documents, you know you are committing treason. You know the stakes. But if a random account on Telegram offers you a bit of Bitcoin to take a photo of a military truck or a construction site near a base, it feels mundane. It feels like a side hustle. It feels like gig work, no different from delivering food or driving for a ride-share app.
We talked about this in episode eight hundred eleven, the gig economy of treason. Back then, it felt like a series of isolated incidents, mostly focused on low-level recruitment. But the data from twenty twenty-five shows it has exploded. Israel saw a four hundred percent increase in suspected espionage cases involving their own citizens last year. Most of these people were not ideological supporters of Iran; they were just people looking for a quick payout, often recruited through "job offer" groups on Telegram.
The financial infrastructure behind this is staggering and provides the "why" behind the scale. TRM Labs recently released data showing that the IRGC has moved over three billion dollars in cryptocurrency since twenty twenty-three. About one billion of that went through exchanges registered in the United Kingdom. They are using this massive pool of digital assets to fund thousands of tiny, low-stakes operations. When you are moving billions, a fifty-dollar payment to a guy in Haifa or Doha is just noise in the system. It is statistically invisible to traditional financial monitoring.
It is the decentralization that makes it so resilient. In the old days, if you caught the handler, you rolled up the whole network. You followed the money trail back to the embassy. Now, there is no network in the traditional sense. There is just a Telegram bot or a faceless handler and a thousand different people who do not know each other, each performing one small, seemingly insignificant task. How does the IRGC even manage that much noise? If you have ten thousand people sending in photos of craters, how do you find the signal?
That is where the modern intelligence center comes in. They use AI-driven image recognition to sort through the thousands of submissions. They can geolocate a photo based on the skyline or a specific street sign in seconds. They are essentially running a massive data-processing operation where the "employees" are the unsuspecting public. This connects back to what we discussed in episode seven hundred seventy-nine about wartime operational security in the digital age. We have moved into a period where everyone is carrying a high-definition sensor in their pocket. The enemy does not need to build a surveillance network; they just need to rent yours for the price of a cup of coffee.
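Before any skyline matching or AI triage is even needed, many submitted photos already carry coordinates in their EXIF metadata. As an illustration of how little arithmetic geolocation from metadata takes, here is a minimal Python sketch that converts EXIF-style GPS values, which are stored as degrees, minutes, and seconds plus a hemisphere reference, into a decimal coordinate. The sample values are invented for the example and are not taken from any real case.

```python
# EXIF stores GPS coordinates as three numbers (degrees, minutes,
# seconds) plus a hemisphere reference ("N"/"S" or "E"/"W").
# Recovering a map-ready decimal coordinate is simple arithmetic.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert an EXIF-style degrees/minutes/seconds triple to a
    signed decimal coordinate. ref is 'N', 'S', 'E', or 'W'."""
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative.
    return -decimal if ref in ("S", "W") else decimal

if __name__ == "__main__":
    # Illustrative values only (roughly central Doha).
    lat = dms_to_decimal(25, 17, 12.0, "N")
    lon = dms_to_decimal(51, 31, 48.0, "E")
    print(f"{lat:.5f}, {lon:.5f}")
```

The point of the sketch is that the "hard" part of this pipeline is not the math; it is the volume, which is exactly what AI-driven sorting solves for the operator.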
I find it interesting how this mirrors historical patterns, even though the technology is new. It reminds me of the World War Two security campaigns, Herman. It feels like we are repeating the same mistakes, just with better cameras.
The "Loose Lips Sink Ships" posters from the nineteen forties are the most famous example. Back then, the concern was that a dockworker might mention a ship's departure date at a bar, and an Axis spy would overhear it. The intelligence was gathered through casual observation of mundane details. The difference today is the speed and the reach. A photo posted to a private Telegram group or even a public Twitter feed can be in an operations center in Tehran within seconds. We have replaced the "barroom overhear" with the "digital upload."
There is also the Cold War parallel of the "illegal" agents or stay-behind networks. The KGB would have people whose only job was to live a normal life and watch things like bus schedules near military bases or shift changes at factories. If the buses started running more frequently, it suggested a mobilization. That is pattern-of-life data. Today, the IRGC does not need to plant a guy for ten years to watch a bus stop. They just need to wait for a soldier to post a "getting ready for work" selfie on Instagram.
And now that pattern-of-life data is being generated voluntarily by civilians and even soldiers. Look at the IDF's digital security nightmare from May twenty twenty-five. Despite all the warnings and the post-October seventh reforms, you still had soldiers posting videos on Instagram or TikTok that revealed their unit locations or the timing of mass gatherings. In one case, a single post revealed enough information for an adversary to map out the entire defensive perimeter of a base. It was not a spy who did that; it was a nineteen-year-old who wanted to show his friends what his morning looked like.
It seems like the biggest hurdle for counterintelligence is distinguishing between the recruited agent and the person who is just being careless. If I see a cool-looking explosion and post a video of it to social media, am I an unintentional intelligence asset or a paid spy? From the perspective of the IRGC, does it even matter?
From their perspective, the result is the same. The data is the data. But for a legal system, the distinction is everything. That is the core of the tactical problem. Most legal systems are built around proving intent. If you cannot prove that the person was communicating with a foreign agent or receiving payment, it is very hard to prosecute them for just taking a photo in a public place. The IRGC knows this. They tell their recruits to act like war tourists or hobbyist photographers. They give them "covers" that are perfectly legal. "I am a researcher looking for photos of local architecture," or "I am an investor interested in the progress of this construction project."
How do you even begin to thwart that? You cannot ban everyone from having a phone near a strike site, and you cannot arrest every person who takes a photo of a sunset that happens to have a radar array in the background.
The shift in counterintelligence has to be toward denying the data rather than just catching the person. It becomes a public awareness problem. You have to convince the population that their social media feed is a battlefield. In Kuwait and Bahrain, they have already started arresting people for filming military movements and sharing them online, regardless of whether they were paid to do it. They are trying to create a culture where filming a missile battery is seen as a hostile act, not a social media opportunity. They are trying to re-establish the "Loose Lips" mentality for the TikTok generation.
It is a tough sell in a world where everyone wants the likes and the engagement that comes with being the first to post a viral video. We are essentially asking people to ignore their natural impulse to share what they see. And it is not just the individuals; it is the OSINT community too.
This is what I call the OSINT inversion. Open Source Intelligence tools like Sentinel Hub or flight tracking websites are used by analysts to keep the public informed, which is great for transparency. But those same tools provide the baseline for adversaries. They use the public OSINT reports to see what the world knows, and then they use their gig economy spies to fill in the gaps that the satellites and the public trackers missed. It is a feedback loop. The OSINT analyst says, "It looks like a strike hit this coordinate," and the IRGC sends a guy on a moped to that coordinate to confirm if the building is actually destroyed or if it was a decoy.
So if an OSINT analyst on Twitter says a strike hit a specific warehouse, the IRGC uses that as a prompt for their gig workers. "Hey, can anyone in this neighborhood go check out the warehouse on Third Street? Fifty dollars in Bitcoin if you get a video of the roof."
It is about verification. We saw this in episode eleven hundred ninety-three when we talked about information attrition. Even if a missile barrage fails to hit its primary targets, the adversary wins if they can force the defender to reveal the locations of their interceptor launchers. If a civilian takes a photo of an Iron Dome battery firing, they have just given away a high-value coordinate that was previously hidden. The IRGC does not care if the missile was intercepted; they care that they now know exactly where the launcher is located for the next wave.
And once that coordinate is out there, it is out there forever. You can move the launcher, but you have revealed the tactical pattern of where those launchers are placed. You have revealed the "logic" of the defense.
There is a case from August twenty twenty-four where Israeli soldiers at a top-secret base were posting photos that revealed the timing of shift changes. To the soldiers, it was just a photo of them having coffee at sunset. To a professional analyst, it was a roadmap of when the base was at its most vulnerable, when the gates were open, and when the guards were most likely to be distracted. This is the "mundane photo" problem. The most valuable intelligence is often the stuff that looks the most boring to a layperson.
It feels like we are living through a massive failure of digital hygiene. We have all this power in our pockets but very little understanding of the second-order effects of using it. We think we are sharing a moment, but we are actually sharing a target.
The IRGC is counting on that lack of understanding. They are exploiters of human nature. They know that people are curious, they know that people like money, and they know that people do not think their individual actions matter in the grand scheme of a war. They rely on the "it is just one photo" fallacy.
That is the misconception we need to bust. People think, oh, it is just one photo of a crater. But when you aggregate ten thousand of those photos, you have a real-time, high-definition map of the entire conflict. You have a BDA that is more accurate than anything the US or Russia could produce during the Cold War.
It is the difference between a single pixel and a full-screen image. Each person is providing one pixel. They do not see the whole picture, and they do not feel like they are contributing to anything significant. But the person at the other end of the Telegram bot certainly sees the whole picture. They are the ones assembling the puzzle.
So what is the takeaway for someone living in a conflict zone or even just near military infrastructure? How do we protect ourselves from becoming an unwitting sensor?
The first thing is to realize that there is no such thing as a mundane photo in a war zone. If it shows a soldier, a vehicle, a piece of debris, or a strike site, it is intelligence. Full stop. If you see something, do not post it. The second thing is to be incredibly suspicious of anyone offering money for simple tasks involving photos or locations. If it feels like a weird side hustle, it probably is. There is no legitimate reason for a "researcher" on Telegram to pay you fifty dollars for a photo of a construction site near a military base.
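For listeners who do want a practical hygiene step, one simple check is whether a photo you are about to share even contains an EXIF metadata segment, which is where GPS coordinates usually live in a JPEG. Here is a standard-library-only Python sketch that walks a JPEG's segment markers looking for an APP1 segment with the "Exif" signature. It is a quick check, not a substitute for a proper metadata scrubber.

```python
import struct

def jpeg_has_exif(data: bytes) -> bool:
    """Return True if a JPEG byte string contains an APP1 Exif
    segment. Walks segment markers; a sketch, not a full parser."""
    if not data.startswith(b"\xff\xd8"):        # SOI marker
        return False
    offset = 2
    while offset + 4 <= len(data):
        if data[offset] != 0xFF:                # not a marker: bail out
            break
        marker = data[offset + 1]
        if marker == 0xD9:                      # EOI: end of image
            break
        # Segment length is big-endian and includes its own 2 bytes.
        length = struct.unpack(">H", data[offset + 2:offset + 4])[0]
        if marker == 0xE1 and data[offset + 4:offset + 8] == b"Exif":
            return True                         # APP1 Exif segment found
        offset += 2 + length
    return False
```

Most phone cameras write this segment by default, which is why "just one photo" so often ships with a precise coordinate attached. Many platforms strip it on upload, but files shared directly, for example over Telegram as documents, typically keep it.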
And for the governments, it seems like they need to move beyond just arresting spies. They need to build better systems for reporting these recruitment attempts. If the recruitment is happening on Telegram, the counter-offensive needs to happen there too.
They are trying. The Mossad and the IDF have been running campaigns to show people what these recruitment messages look like. They often start very casually, building rapport over days or weeks before they ask for anything sensitive. They might ask you to do something totally legal first, like taking a photo of a public park, just to get you used to the process of receiving payment. It is social engineering on a national scale.
It is a long-game approach. They groom the "sensor" before they use them.
Precisely. And the use of cryptocurrency makes it almost impossible to track the individual payments. When you are moving three billion dollars through five thousand different addresses, a fifty-dollar payment is a needle in a haystack of needles. The IRGC has built a financial engine that is perfectly suited for this kind of micro-espionage.
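The "needle in a haystack of needles" point can be shown with a back-of-the-envelope simulation: if monitoring flags transfers above a fixed reporting threshold, fifty-dollar payouts mixed into a pool of large treasury-style movements are never flagged at all. The numbers below are invented for the sketch and are not drawn from the TRM Labs figures.

```python
import random

random.seed(7)

THRESHOLD = 10_000  # a stand-in for a large-transfer reporting threshold

# A mixed pool: a few thousand large treasury-style movements...
transfers = [random.uniform(50_000, 5_000_000) for _ in range(5_000)]
# ...plus ten thousand ~$50 operational payouts.
payouts = [random.uniform(30, 80) for _ in range(10_000)]

pool = transfers + payouts
flagged = [t for t in pool if t > THRESHOLD]

print(f"total transfers:       {len(pool)}")
print(f"flagged by threshold:  {len(flagged)}")
print(f"payouts flagged:       {sum(1 for p in payouts if p > THRESHOLD)}")
```

Every large transfer trips the threshold and every payout slips under it, which is the whole design: the operational payments hide below any sane reporting line, and catching them requires graph analysis of the addresses rather than amount-based rules.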
I think we are going to see this model spread to every major conflict. Why risk a high-value operative who took years to train when you can just crowdsource your intelligence for the price of a few thousand cups of coffee? It is the ultimate low-risk, high-reward strategy for an intelligence agency.
It is. Even if the person gets caught, the agency has lost nothing. They have no "skin in the game." They just move on to the next person in the Telegram group. The "spy" is a disposable commodity in this new economy.
It is a sobering thought. The battlefield is not just where the soldiers are; it is everywhere there is a cell signal and a person with a bank account or a crypto wallet. We have moved from the era of the spy to the era of the sensor. And whether we like it or not, we are the sensors.
It is a fascinating and slightly terrifying evolution of the human element in warfare. We are all part of the intelligence cycle now, whether we intend to be or not.
That is a perfect place to wrap this up. We have moved from high-stakes espionage to the gig economy of treason, and the implications for operational security are just massive. It is not just about catching the bad guys anymore; it is about teaching the good guys how to stop being accidental assets.
And that starts with understanding that your smartphone is a weapon system if you use it the wrong way.
Thanks as always to our producer Hilbert Flumingtop for keeping the gears turning behind the scenes and making sure our own digital hygiene is up to snuff.
And a big thanks to Modal for providing the GPU credits that power the generation of this show.
This has been My Weird Prompts. If you want to dive deeper into our archive and see how we have tracked this trend over the last few years, head over to myweirdprompts dot com. You can search the whole catalog there and find the episodes we mentioned today, like episode eight hundred eleven and eleven hundred ninety-three.
Thanks for listening, and stay safe out there.
Catch you in the next one.