
Commander Joshua Wallace doesn’t talk much about titles or ambitions. Twenty-seven years into his career with the Chicago Police Department, he’s built a reputation on something else entirely: showing up, doing the work, getting it right.
“The shift from hierarchical criminal organizations to decentralized networks has fundamentally altered investigative architecture in several critical ways,” Wallace says, sitting in his office within CPD’s Bureau of Counterterrorism, where he oversees the Criminal Network Group. It’s the kind of statement that might sound academic coming from someone else. From Wallace, it carries the weight of two and a half decades spent watching Chicago’s criminal landscape evolve, often in ways that left traditional policing strategies obsolete.
Wallace graduated from the FBI National Academy and the Senior Management Institute for Police. He has commanded drug investigations and supervised tactical operations, and he's been a finalist for multiple chief-of-police positions nationwide. But those credentials matter less than what they represent: a career built on adaptation, on recognizing when what worked yesterday won't work tomorrow.
The work has never been straightforward. Chicago hasn't made it easy, but Wallace has stayed, navigating personal and professional challenges with what colleagues describe as uncommon integrity. He's earned trust by being willing to lose it, by making decisions that weren't popular but were right.
His approach centers on fairness, responsibility, and building trust between law enforcement and the communities it protects. In an era defined by consent decrees and federal oversight, Wallace has embraced transparency as operational necessity rather than bureaucratic burden.
Wallace knows the old model is dead. “Decentralized networks don’t respect boundaries,” he explains. “A drug trafficking organization operating in your district may have supply lines spanning three states and coordinate via encrypted apps hosted overseas.” The investigation can’t be local anymore. It requires federal partnerships from day one. FBI, DEA, ATF. Access to fusion centers. Multi-agency task force integration built into the investigation’s foundation, not added later when local efforts stall.
The shift demands comprehensive intelligence mapping before deciding where to intervene. Identify the ecosystem, who the nodes are, how they connect, the actual structure. Only then can investigators determine intervention points. This requires front-loading analytical resources rather than building cases linearly, a fundamental restructuring of how investigative teams operate.
Technology isn’t a tool anymore. It’s core infrastructure. “Decentralized groups rely on technology for coordination,” Wallace says. “Encrypted messaging, cryptocurrency, dark web marketplaces, burner phones.” Investigations must integrate digital forensics, cyber capabilities, and signals intelligence as foundational elements. The investigative team structure itself needs embedded technical skill, not someone you call when you hit a dead end.
Enhanced operational security becomes mandatory. Decentralized groups operate in cellular structures where members know little about one another beyond their immediate cell. Investigative teams must mirror this with tighter operational security and compartmented information sharing, assuming targets have counter-surveillance capabilities through social media, scanners, and potentially compromised sources.
Wallace watches for warning signs constantly. The drift from legal and ethical foundations happens slowly, then suddenly.
“One of the earliest warning signs appears in the handling of informants and sources,” he says. Officers become overly protective about source identities beyond what confidentiality requires. The same confidential source appears on multiple warrants but never testifies or surfaces at hearings. Officers resist supervisor review of source files. Documentation becomes vague.
The relationship between handler and source creates loyalty that overrides judgment. Wallace has seen it before. The handler believes in the source, trusts them, starts building cases around what the source provides rather than corroborating independently.
Documentation degradation follows. Report quality declines. Officers write search warrant affidavits that are formulaic, templated rather than fact-specific. “When you see copy-paste language across multiple cases, that’s not efficiency,” Wallace says. “It’s manufacturing probable cause.”
Results-driven rationalization emerges when “we know he’s dirty” justifies shortcuts. Officers celebrate stats without scrutinizing how they were achieved. Teams start talking about targets by asset value rather than criminal impact, and accounting gets murky. “Money corrupts faster than almost anything else,” Wallace says.
Officer isolation manifests as an us-versus-them mentality excluding the rest of the department. “When officers begin viewing oversight mechanisms as obstacles rather than safeguards, you’ve lost the foundation,” Wallace says.
Constitutional corner-cutting shows up in Terry stops that are really arrests, consent searches that aren’t voluntary, protective sweeps that become searches. The erosion occurs when officers learn to articulate lawful justifications for actions they’ve already decided to take.
Wallace monitors personal leadership indicators too. When you make excuses for your people rather than holding them accountable. When you’re defending decisions to outsiders that you wouldn’t defend internally. “These are signs that you’re participating in the drift rather than preventing it,” he says.
The tension between speed and accuracy defines modern intelligence work. Wallace structures his response around consequences.
“Not all intelligence requires the same level of verification before acting,” Wallace says. Tactical intelligence that will result in a search warrant, an arrest, or a raid demands rigorous corroboration and documentation. Intelligence that will inform patrol deployment or surveillance priorities can move faster with lower verification thresholds. The key is matching quality control to the consequences of being wrong.
Wallace structures this through decision gates based on contemplated action. Putting eyes on a location or adjusting patrol patterns requires lower intelligence bars with rapid approval. Seeking a warrant or planning a tactical operation demands multiple verification layers regardless of time pressure. This prevents the dangerous situation where urgency justifies cutting corners on high-consequence decisions.
Building speed through preparation is critical: standing protocols, pre-approved surveillance authorities where legally possible, template packages for common warrant types requiring only case-specific facts, and established relationships with prosecutors who can move quickly when intelligence is solid. Speed comes from eliminating administrative friction, not from reducing verification rigor.
Team structure and role specialization help. Analysts rapidly process raw intelligence and identify patterns. Validators provide quality control before intelligence is operationalized. Separating these functions prevents the natural tendency to see what you want to see under pressure. The analyst develops the lead, the validator stress-tests it, and command decides whether to act.
“The cultural dimension of how your organization treats intelligence errors shapes everything,” Wallace says. If the response to a bad lead is punitive, analysts will slow everything down to achieve perfect certainty, guaranteeing missed time-sensitive opportunities. If errors are treated as learning opportunities and the standard is a reasonable process rather than perfect outcomes, people move faster with appropriate confidence.
Mandatory cooling-off periods for certain intelligence-driven actions prevent momentum from overwhelming judgment. Pre-operation briefings serve similar functions. When the tactical team, the analysts, and the case officers all explain the intelligence basis for an operation, inconsistencies or gaps become visible. Team members ask questions from different perspectives, and that collective scrutiny often catches problems.
Wallace believes most crises are failures of earlier intervention. By the time something is a full-blown crisis, you’ve missed multiple opportunities to address it when it was manageable.
The crisis that forces your hand is usually the culmination of avoiding accountability when it would have been easier.
Communication failures amplify every problem. “The lesson is that during a crisis, you communicate even when you don’t have complete information, even when the news is bad, even when you’re not exactly sure what happened yet,” Wallace says. Silence breeds conspiracy theories.
Defensive reactions make everything worse. The instinct during a crisis is to defend your people, defend your decisions, explain why critics don’t understand. But defensiveness prevents learning and confirms suspicions that you’re unwilling to acknowledge problems. Wallace has learned to separate supporting people from defending bad actions. You can back officers who acted in good faith while acknowledging that outcomes were terrible and investigating thoroughly.
Trust built during calm times determines capacity during crisis. If communities don’t trust you before the crisis, they won’t trust you during it. If officers don’t believe in your leadership on a day-to-day basis, they won’t follow you when things are chaotic. A crisis reveals the strength or weakness of relationships built long before.
“The cost of delayed acknowledgment is exponential,” Wallace says. Every day you delay admitting a problem, acknowledging a mistake, or taking corrective action, the cost to credibility rises. Early, honest acknowledgment of problems, even when painful, prevents catastrophic loss of trust.
Organizations remember how you led during a crisis long after it ends. Officers watch whether you protected them or threw them under the bus when the pressure mounted. Communities watch whether you were transparent or defensive. The organization watches whether you maintained your values or abandoned them for expediency.
The question Wallace always asks when something goes wrong is what he could have done six months ago, a year ago, three years ago to prevent this. That backward-looking analysis informs forward-looking prevention. Most crises have warning signs that were ignored or rationalized. Learning to see and act on those signals is the real lesson from failure.