Wednesday, February 4, 2026

Standards Optional

There is a habit in American politics of reducing serious disagreements to tone, loyalty, or personal preference, as if every concern were merely a difference of opinion. That approach feels fair-minded, but it breaks down when actions touch the core principles that hold a constitutional democracy together. Some behaviors aren’t controversial because they’re misunderstood; they’re controversial because they test whether the rules apply equally and whether human dignity is treated as non-negotiable.

Many Americans value strength in leadership. But strength is not the same as cruelty. Publicly mocking people with disabilities wasn’t simply rough humor or an offhand remark—it signaled that vulnerability is acceptable to target. Across religious traditions, conservative philosophy, and civic ethics alike, there is agreement on one point: dignity is inherent, not earned. When leaders model contempt, it lowers the standard for everyone.

Likewise, criminal convictions are not supposed to be political weapons—but they also aren’t optional. Due process exists precisely to protect the innocent and restrain the powerful. Rejecting verdicts outright because they apply to influential figures weakens the rule of law conservatives have long argued is essential to a free society. If laws apply only to those without power, equality before the law becomes an illusion.

January 6th deserves careful and honest evaluation. Peaceful protest is a constitutional right. But encouraging distrust in a lawful election and pressuring supporters to overturn its outcome by force crosses a line that conservatives historically warned against. The peaceful transfer of power is not a partisan tradition; it is a foundational safeguard. When violence in service of loyalty is excused, the precedent endangers everyone, regardless of party.

Courts exist to limit executive power—not to obstruct leadership, but to prevent tyranny. Ignoring court orders is not bold resistance; it is the rejection of constitutional checks and balances. Limited government only works when no one is placed above the law.

The same standard applies to military force. The Constitution assigns war-making authority to Congress to prevent unilateral decisions that cost lives. Acting without authorization may appear decisive, but it bypasses democratic accountability and treats human life as expendable. Conservatism has long argued that war should be rare, justified, and lawful—because its consequences are irreversible.

Transparency is another conservative value. Attempts to delay, obscure, or interfere with accountability surrounding the Epstein records raise serious ethical concerns. Abuse flourishes in secrecy. This is not a partisan issue, and protecting the powerful from scrutiny—especially when exploitation of minors is involved—undermines moral credibility entirely.

Domestically, enforcement without accountability invites abuse. Immigration enforcement, for example, does not require abandoning due process or separating families without recourse. Government power, when unchecked, inevitably harms the innocent—something limited-government advocates have warned about for generations.

Internationally, alienating allies while praising authoritarian leaders weakens stability and American credibility. Diplomacy isn’t weakness; it is a tool to prevent conflict and protect national interests without unnecessary loss of life.

At home, deploying military force against citizens exercising constitutional rights crosses a dangerous threshold. The military exists to defend the nation, not intimidate it. When dissent is treated as disloyalty, citizenship becomes conditional.

Economic policy follows the same ethical test. Cutting services for ordinary people while enabling personal enrichment is not fiscal discipline—it is a misuse of public trust. Government exists to serve the common good, not private gain.

None of this is speculative. Impeachment is not symbolic outrage; it is a constitutional mechanism designed for serious abuses of power. It exists precisely for moments when ordinary safeguards are strained.

Taken together, these issues form a pattern rather than isolated controversies. They raise questions not about ideology, but about accountability, restraint, and moral responsibility.

These are not ordinary policy disagreements. They are thresholds.

A democracy cannot survive selective accountability. Justice cannot exist when consequences depend on status. And leadership loses legitimacy when cruelty is reframed as strength.

History will not only record what happened—it will remember who excused it, who questioned it, and who refused to look away.

I know where I stand.

Sunday, February 1, 2026

The Spaces That Hold Us



Equal access is often celebrated as progress. The argument goes: if everyone can attend the same schools, live in the same neighborhoods, or worship in the same churches, why do separate Black institutions still matter? On the surface, it seems logical. But access is not the same as equity. Access is not the same as belonging. Access is not the same as home.

Being allowed in a room doesn’t mean the room was built for you. You can attend the school your parents didn’t, sit in the pews of a church that wasn’t designed with your culture in mind, or move into the neighborhood your grandparents couldn’t afford. You can do all of it and still feel like a guest. Safe? Maybe. Belonging? Rarely.

Inequity in access persists in ways that are often invisible. For example, during a year of budget cuts in Louisiana, the cut at LSU meant forgoing a new dorm, while the cut at Southern University meant going without new boilers. Two public universities, both educating students and serving the state, yet one clearly held the advantage in funding, maintenance, and infrastructure. Access existed in theory, but resources did not. That imbalance isn’t just a line in a budget—it shapes experiences, opportunities, and outcomes.

Black institutions were never just about access. They have always been about safety, belonging, leadership, and the preservation of culture in a world that systematically denied those things. Black churches offered sanctuary and guidance when no one else would. HBCUs created leaders and scholars in spaces where Black excellence was expected rather than exceptional. Neighborhoods, communities, and family networks provided support when larger systems actively withheld it. These spaces were—and remain—lifelines.

The same choices are framed differently depending on who makes them. When white families live in the neighborhoods they grew up in, send their children to the schools their parents attended, or worship at the churches of their ancestors, it is celebrated as tradition, loyalty, and roots. When Black families do the same, it is often labeled limiting, sentimental, or even backward. The decisions themselves do not change. Only the lens does.

Many traditions that persist in Black communities today have roots in slavery. Sunday church services, family gatherings, storytelling, soul food, quilting, dance, and communal child-rearing were originally acts of survival and resistance. Spirituals were sung under threat, often carrying hidden messages of hope and escape. Quilts were used to communicate routes to freedom. Families gathered wherever they could to reclaim connection after being torn apart. These practices were never optional—they were lifelines.

Even the painful, negative parts of history are important. Erasure of suffering is not progress. History matters in its entirety, just as a doctor relies on a patient’s full medical history to treat the present and prepare for the future. Understanding the oppression, stolen lives, and systemic barriers is essential to preserving Black spaces today. It teaches resilience, ingenuity, and community in ways that “equal access” alone cannot.

Black institutions are not relics; they are living proof that Black culture has endured, adapted, and thrived despite systemic barriers. They remind us that supporting these spaces is not about resisting integration—it is about protecting spaces that affirm identity, nurture belonging, and pass knowledge across generations.

Churches, HBCUs, neighborhoods, family traditions, and Black-led spaces exist because elsewhere was never designed to fully hold Black communities. They preserve culture, identity, and history, while teaching lessons from both the triumphs and the hardships of the past. Progress without understanding history is incomplete. Proximity without belonging is hollow.

Supporting Black institutions is not nostalgic—it is practical, cultural, and essential. Access may open doors, but belonging, culture, and home are cultivated. To abandon spaces built for Black communities in pursuit of systems that were never designed to fully sustain them is not progress—it is erasure.

Black institutions do more than preserve tradition—they preserve life, history, and the lessons that come from surviving and thriving in a world that didn’t design itself for Black people. That is why they still matter.

Saturday, January 31, 2026

The Privilege of Never Having to Prove I Belong



I didn’t study for a civics test to become an American. I didn’t wait years for paperwork to be approved. I didn’t risk my life crossing borders or leave behind everything familiar in pursuit of safety or opportunity. I was born here, and that single fact granted me citizenship. Not merit. Not effort. Just chance. The older I get, the more I understand how much of my life has been shaped by that unearned security.

Growing up, I didn’t realize how much people hated immigrants. That part came later. What I remember instead is curiosity. I remember wondering what could be so bad about places in South America or other parts of the world that people would leave everything they knew to come here. I didn’t see immigration as a threat or a problem; I saw it as a question. Why would someone risk so much unless staying was worse?

At church, immigration wasn’t discussed in political terms at all. Every week we prayed for Haiti, for Rwanda, for the Democratic Republic of the Congo, and for other African countries. We prayed for safety, for stability, for peace. Those prayers shaped how I understood the world long before policy debates did. They taught me that suffering wasn’t abstract and that borders didn’t separate people from God’s concern. No one ever suggested those lives mattered less because they were elsewhere.

As I got older, I began to realize that while I was taught to care about people beyond our borders, I never had to worry about my own place within them. I didn’t have to think about whether enrolling in school, applying for a job, seeking medical care, or traveling could put my family at risk. I didn’t carry the quiet fear that a traffic stop or workplace encounter could unravel everything. That kind of freedom settles into you so deeply that you mistake it for normal, when in reality it’s privilege.

One moment that shifted how I understood immigration in a very real way happened during jury duty. I don’t even remember what the case was about, but I remember two of the witnesses clearly. They were immigrant truck drivers, and as part of establishing credibility, the attorney asked where they were from and what they did for work. Then came the question about what they used to do before immigrating. One said he had been a math teacher. The other said he had been an art teacher. That moment stayed with me—not because it was dramatic, but because it was quietly revealing. These weren’t people lacking skill or ambition. These were people whose talents didn’t disappear when they crossed a border; they were simply redirected by circumstance.

After that, I started asking immigrants I met—drivers, janitors, warehouse workers, service staff—what they used to do before coming here. The answers were wide-ranging: teachers, engineers, accountants, business owners, skilled tradespeople. Again and again, I was reminded that immigration doesn’t strip people of ability; systems do. Credentials don’t always transfer. Language barriers matter. Survival often comes first. People take the work that’s available, not because it reflects their worth, but because it keeps their families alive.

That understanding deepened even more during the years I volunteered doing taxes. For about three years, I helped prepare returns, and we regularly worked with migrants who had ITINs. They came in with pay stubs, documentation, questions—doing exactly what the system told them to do. So when I hear people confidently say that immigrants “don’t pay taxes,” I always find myself wondering who exactly they’re talking about. Because the people I sat across from were paying into the same systems I do—often without the possibility of receiving the benefits those systems provide.

Listening to immigrant friends and neighbors over time has made it impossible to ignore how uneven our realities are. People who work full-time, pay taxes, raise families, and contribute to their communities often live with a constant awareness of what they could lose. Undocumented immigrants contribute tens of billions of dollars each year in federal, state, and local taxes, including billions into Social Security and Medicare systems they may never benefit from. The country relies on that labor and that revenue while denying the people providing it basic security. Meanwhile, I benefit without ever having to prove I deserve it.
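
For a sense of the arithmetic behind those contributions, here is a rough sketch using the standard employee-side payroll tax rates of 6.2 percent for Social Security and 1.45 percent for Medicare; the wage figure is a hypothetical example, not any real filer's return.

```python
# A rough sketch of the payroll-tax arithmetic described above.
# The rates are the standard employee-side FICA rates; the wage
# figure is a hypothetical example, not any real filer's return.

SOCIAL_SECURITY_RATE = 0.062
MEDICARE_RATE = 0.0145

def employee_fica(annual_wages):
    """Return (Social Security, Medicare) withheld from annual wages."""
    return annual_wages * SOCIAL_SECURITY_RATE, annual_wages * MEDICARE_RATE

if __name__ == "__main__":
    wages = 30_000  # hypothetical annual wages
    ss, medicare = employee_fica(wages)
    print(f"Social Security withheld: ${ss:,.2f}")       # $1,860.00
    print(f"Medicare withheld:        ${medicare:,.2f}")  # $435.00
```

Multiply figures like those across millions of workers and the "tens of billions" stops being abstract.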

That contradiction becomes even clearer when immigration enforcement is framed as an economic or security necessity. If enforcement were truly about reducing undocumented labor, it would begin where immigrant labor is most concentrated and most essential. States like Texas and Florida rely heavily on immigrant workers in agriculture, construction, hospitality, and caregiving. When Florida passed one of the harshest immigration laws in the country, the impact was immediate—labor shortages, crops left unharvested, stalled construction projects, and billions of dollars in projected economic losses. The lesson was unavoidable: our economy depends on immigrant labor far more than our rhetoric admits.

So when enforcement intensifies selectively, often in places where the economic fallout is smaller, it feels less like problem-solving and more like performance. Immigration isn’t being addressed honestly; it’s being used symbolically. We demand labor, restrict legal pathways to provide it, and then criminalize the people who respond to that demand.

People often say immigrants should “come the right way,” but I never had to think about a right way at all. I didn’t wait in a years-long backlog. I didn’t need a sponsor or thousands of dollars in legal fees. I didn’t have to prove my life was worthy of admission. The system I’m protected by is the same one that shuts others out, not because they lack character or effort, but because opportunity is rationed unevenly by design.

What troubles me most is how easily citizenship becomes a stand-in for moral worth. Crossing a border without authorization is treated as evidence of bad character, while being born inside one is treated as proof of deservingness. But borders don’t measure values, work ethic, or humanity. The real difference between me and someone who crossed a border without permission is paperwork and probability. I was lucky. They were not.

For those who frame this debate in religious terms, the disconnect is impossible to ignore. We prayed for the world beyond our borders while forgetting the people who crossed those borders seeking safety and opportunity. Scripture is clear about welcoming the foreigner and refusing cruelty, yet policy discussions often sound more like fear than faith.

I didn’t earn my citizenship. I didn’t prove I was worthy of it. I was born into it. And acknowledging that doesn’t weaken my place in this country—it clarifies my responsibility. If we want a real conversation about immigration, it has to start with honesty about how arbitrary citizenship is, how dependent we are on immigrant labor, and how much of this debate is driven by optics instead of truth.

I didn’t earn my citizenship. I was born into it. And that truth should change how we talk about who belongs.



Friday, January 30, 2026

Fast, Cheap, and Selectively Moral


Modern life runs on two unspoken priorities: speed and affordability. We want information instantly, products delivered overnight, food year-round, and solutions that don’t slow us down. Entire systems—economic, technological, and cultural—have been built to meet those expectations.

And yet, every so often, we collectively pause and draw a moral line around a single tool.

Right now, that tool is AI.

Concerns about AI’s environmental impact, labor exploitation, and long-term consequences are real and worth examining. But it’s also worth asking why AI, specifically, gets singled out—especially when it’s often being used to support reasoning, not replace it.

I think back to a second-grade social studies lesson on industrialization. We learned how a small number of sewing machines could replace hundreds of tailors. The lesson wasn’t framed as a moral failure of the machine; it was presented as a fact of progress—machines made production faster, cheaper, and scalable. The human cost was acknowledged, but the trajectory was treated as inevitable.

That framing matters, because it mirrors how society has always absorbed trade-offs.

We already live comfortably inside a world shaped by environmentally costly systems. Cars, air travel, fast fashion, streaming platforms, industrial agriculture, and global supply chains all rely on massive energy use and low-wage labor, often in developing nations. These systems persist not because they’re harmless, but because they deliver what modern life demands: convenience at speed.

Fast and cheap has always won.

Industrialization didn’t just replace labor; it reshaped expectations. Clothing became affordable. Goods became abundant. Productivity became the measure of success. And while entire professions were disrupted—often at great human cost—society didn’t opt out. It adapted, normalized the new tools, and moved forward.

AI fits squarely into that lineage.

What feels different now isn’t the existence of trade-offs, but how selectively we moralize them. Instead of interrogating the systems that demand constant acceleration, we focus on individuals who use the tools those systems incentivize. We criticize the person instead of the structure.

There’s also an added layer of discomfort: AI challenges how we define thinking itself.

When someone uses AI to compare ideas, articulate reasoning, or stress-test a decision, it exposes something we’ve long avoided admitting—human thought has always been supported by tools. Writing externalized memory. Calculators offloaded arithmetic. Search engines reorganized knowledge. None of these eliminated thinking; they changed its shape.

AI simply makes that process visible.

So when AI is used to clarify reasoning rather than replace it, labeling it as intellectual laziness feels less like a principled stance and more like a reaction to unease. Unease with speed. Unease with shifting definitions of competence. Unease with how much modern life already depends on invisible systems doing work for us.

We want things fast and cheap—but we also want to feel ethically intact.

That contradiction fuels selective outrage. We worry about AI’s environmental impact while streaming endlessly. We decry labor exploitation while relying on global supply chains designed around it. We condemn one tool while quietly accepting dozens of others that operate on the same principles.

The question isn’t whether AI has costs. It does.
The question is why we expect individuals to opt out of this particular tool—especially when used thoughtfully—while remaining fully embedded in every other system optimized for speed and scale.

If the sewing machine taught us anything in second grade, it’s that tools don’t replace values—they reveal them. And the value modern society has consistently chosen is efficiency, even when the costs are unevenly distributed.

Until we’re willing to interrogate that, singling out AI isn’t moral clarity. It’s displacement.

And maybe the discomfort we feel around AI isn’t about environmental harm at all, but about recognizing—again—that we’ve built a world where fast and cheap isn’t just preferred. It’s expected.

Wednesday, January 28, 2026

Religion Was Not the Gift



One of the most persistent myths in American history is the idea that Africans brought to the United States and the Caribbean were religiously blank—that they had no structured belief system until Christianity was imposed on them.

That story is false. And not in a small, technical way. In a completely-reshapes-how-you-understand-history way.

Africans who were kidnapped and sold into slavery did not come from a single place, culture, or belief system. They came from dozens of societies across West and Central Africa, each with established spiritual traditions, moral frameworks, rituals, and cosmologies. What they practiced were not “superstitions” or vague nature worship, but fully developed religious systems that governed ethics, community life, healing, justice, and identity.

Most of these traditions shared a similar structure: belief in a supreme creator, reverence for ancestors, and engagement with spiritual forces that mediated between the divine and human life. The creator was often not approached directly—not because of primitiveness, but because of theological logic. Intermediary spirits existed for the same reason saints, angels, or prophets exist in other religions.

For example, among the Yoruba people (in present-day Nigeria and Benin), the supreme being was Olódùmarè, with Orishas like Ogun, Oshun, Shango, and Yemoja serving as divine forces tied to morality, labor, fertility, justice, and the natural world. Among the Akan of Ghana, Nyame was the creator, with a strong emphasis on ancestor veneration. In Central Africa, the Kongo people worshipped Nzambi Mpungu and understood life and death as interconnected realms separated by the kalunga line—a cosmology that later surfaced in African American spiritual practices.

And then there’s a fact that often makes people uncomfortable because it disrupts a familiar narrative: many enslaved Africans were already Muslim. Large numbers came from regions like Senegambia, Mali, and Guinea, where Islam had been practiced for centuries. Some were literate in Arabic. Others were already Christian—particularly from the Kingdom of Kongo, which had converted to Christianity in the 1400s, long before English colonies existed.

So no, enslaved Africans were not “introduced” to religion in the Americas. What happened instead was suppression.

African spiritual practices were banned, punished, and demonized. In response, people adapted. They practiced in secret. They embedded belief into music, movement, and oral tradition. They layered African cosmologies beneath Christian imagery. This is how Orishas became associated with Catholic saints, how African rhythms shaped Black church worship, and why call-and-response, shouting, and embodied praise feel fundamentally different from European Christianity.

These weren’t accidents. They were acts of survival.

Understanding this matters because it reframes enslaved Africans not as people who were given culture, morality, or faith—but as people who carried all of that with them and fought to preserve it under unimaginable conditions.

When we erase that truth, we make the violence of slavery seem less total than it was—and we diminish the resilience that followed.

History doesn’t get clearer when it’s simplified. It gets clearer when we’re honest.



Tuesday, January 27, 2026

Politics as Usual

Last year, I wrote to my senators, Ted Cruz and John Cornyn, because I was uneasy about the direction the Trump administration appeared to be taking with its Cabinet appointments. My concern wasn’t rooted in party affiliation or personal animus; it was about whether the people being placed in charge of critical federal agencies had the experience, judgment, and ethical grounding required for the roles they were assuming.

Senator Cornyn responded with a reminder of the Senate’s constitutional responsibility to advise and consent on nominees. That response was procedurally correct, but it didn’t fully address the substance of my concern: what happens after confirmation, when theory becomes practice and decisions begin affecting real lives, institutions, and national stability.

Now, months into these appointments, the consequences are no longer hypothetical.

In national security and defense, the importance of steady, credible leadership cannot be overstated. Pete Hegseth entered the role of Secretary of Defense with a strong media presence and some military experience, but without a background in senior strategic leadership. Since his confirmation, reports have pointed to friction between civilian leadership and career military professionals, along with confusion in communications with allies and service members. These aren’t abstract bureaucratic issues; they affect readiness, morale, and trust at a moment when global tensions demand clarity and coordination.

At the State Department, Marco Rubio brought a conventional political résumé and deep familiarity with foreign policy debates. However, his hardline approach toward adversaries like China and Iran has coincided with increased diplomatic strain. In an era where alliances are already fragile, an overly confrontational posture risks escalation when restraint and coalition-building are often more effective.

The Department of Homeland Security offers another example of how leadership decisions ripple outward. Under Kristi Noem, DHS has faced operational crises and sharp congressional backlash, particularly following enforcement actions that resulted in civilian deaths. At the same time, the removal of climate-related preparedness programs has raised concerns about disaster readiness. Homeland security works best when it is boring, competent, and trusted—not when it becomes a flashpoint for political conflict during emergencies.

The administration’s approach to government restructuring has also revealed tensions between private-sector logic and public responsibility. The creation of the Department of Government Efficiency under Elon Musk and Vivek Ramaswamy was pitched as a bold attempt to streamline bureaucracy. In practice, rapid reorganizations, mass firings, and the removal of public records have weakened institutional memory and raised questions about transparency and conflicts of interest. Innovation matters, but government is not a startup; it exists to provide continuity, fairness, and accountability.

Public health may be where the stakes are most immediate. At Health and Human Services, Robert F. Kennedy Jr. has overseen deep restructuring, including significant cuts to research funding and staffing at agencies like the NIH and CDC. Regardless of one’s views on regulation or federal scope, these institutions form the backbone of disease surveillance, vaccine development, and emergency response. Weakening them doesn’t just affect bureaucrats—it affects families, hospitals, and communities when the next outbreak or crisis emerges.

Veterans Affairs tells a similar story. Doug Collins’s proposal to reduce VA staffing by roughly 15 percent may be driven by cost concerns, but the practical effect is longer wait times and reduced access to care for veterans who already struggle to navigate the system. Fiscal responsibility is important, but it must be balanced against the government’s obligation to those who served.

Beyond these headline roles, the pattern repeats across the administration. Tulsi Gabbard’s service as Director of National Intelligence brings valuable military and foreign policy perspective, but limited operational intelligence experience has complicated coordination across the intelligence community. Dr. Mehmet Oz’s leadership at CMS has been marked by staffing cuts and policy shifts that have slowed access to healthcare services. Mike Huckabee’s appointment as ambassador to Israel highlights the risks of prioritizing ideological alignment over diplomatic training in sensitive regions. John Ratcliffe’s tenure at the CIA raises concerns about managing complex intelligence operations without deep operational background.

Environmental and economic oversight have also shifted. Under Lee Zeldin, environmental protections have been rolled back in ways that reduce accountability for pollution. At Commerce, Howard Lutnick’s focus on private financial interests has left trade policy uneven. Across agencies, figures like Stephen Miller and Kristi Noem have emphasized ideological alignment, often at the expense of operational expertise and humanitarian considerations.

Taken together, these choices suggest a governing philosophy that values loyalty, visibility, and disruption over experience, continuity, and institutional knowledge. That approach may appeal to voters frustrated with bureaucracy, but governing is not the same as campaigning. Institutions weakened in the name of efficiency or alignment are difficult to rebuild, and the costs are often borne by ordinary people rather than political leaders.

When I first contacted Senators Cruz and Cornyn, my argument was simple: ideology and loyalty cannot substitute for competence. Looking at the cumulative impact of these appointments—across national security, public health, veterans’ care, environmental protection, and public trust—it’s hard to argue that those concerns were misplaced.

The American people deserve leaders who are qualified, ethical, and capable of stewarding the institutions that protect us all. When those standards slip, the consequences don’t fall along party lines—they fall on the country as a whole.



Sunday, January 25, 2026

The Irony of Anti-Immigration Rhetoric from the Children of Immigrants


Immigration has long been a contentious topic in American politics, often evoking strong emotions and heated debates about who “deserves” a place in the country. What’s striking—and frequently overlooked—is that some of the loudest critics of immigration are themselves the children of immigrants.

Take Kash Patel, for example. His parents, Premode (or Pramod) and Anjna Patel, were Gujarati Indian immigrants who had previously lived in Uganda and Tanzania before settling in New York. His father was a refugee who fled Idi Amin’s brutal regime in Uganda in 1972, seeking safety and opportunity in the United States. Similarly, Dr. Mehmet Oz was born to Turkish immigrants Mustafa and Suna Öz, who moved to the U.S. in pursuit of a better life. Even Donald Trump, a figure synonymous with anti-immigration rhetoric, is the son of Scottish-born Mary Anne MacLeod, who immigrated to the U.S., bringing her own story of courage and determination. Melania Trump adds another layer to this pattern, having been born in Yugoslavia before moving to America, and Marco Rubio’s parents fled Cuba in 1956 during the regime of Fulgencio Batista, years before Fidel Castro came to power, bringing with them the hopes and fears of those leaving their homeland behind.

The irony here is impossible to ignore. These individuals have all benefited directly from the opportunities, protections, and rights afforded to immigrants in the United States, yet some of them have used their platforms to support policies and rhetoric that demonize the very people who followed similar paths to theirs. It is a stark reminder of how personal history can be selectively embraced—celebrated when convenient, erased when inconvenient.

This contradiction underscores a larger trend in politics: the ability to dehumanize others while enjoying the fruits of one’s own immigrant lineage. Their stories illuminate a fundamental truth about America’s history and identity—it has always been shaped by people who came from elsewhere, seeking safety, stability, and the chance for a better life. Remembering that history is crucial, especially when immigration is wielded as a political weapon rather than recognized as the lifeblood of the nation.


Saturday, January 24, 2026

Four White Men Have More Wealth Than Half the Planet — and Somehow We’re Still Pointing the Wrong Way

There’s a sentence that makes people uncomfortable because it cuts through the noise too cleanly:
Four white men have more wealth than half the planet … and y’all still think a brown immigrant is why you’re struggling.

It sounds provocative, but it isn’t exaggerated. It’s descriptive. And the discomfort it causes isn’t about tone — it’s about what it exposes.

We live in a moment where financial stress is nearly universal. Rent keeps climbing. Groceries cost more every month. Healthcare feels out of reach. Wages lag behind inflation, and stability feels like a luxury instead of a baseline. People are tired, anxious, and looking for answers. And instead of being encouraged to look up — toward power, toward wealth, toward systems — we’re taught to look sideways. Or down.

That redirection is not accidental.

Because at the very top of the global economy sit a small group of men — Elon Musk, Jeff Bezos, Larry Page, Larry Ellison, Mark Zuckerberg, and a few others — whose combined wealth reaches into the trillions. Trillions. Numbers so large they stop feeling real, even as they eclipse the combined resources of billions of human beings trying to survive on a few thousand dollars a year or less.

This isn’t about whether these men are smart or innovative. It’s about scale. It’s about proportion. It’s about what it means when a handful of individuals possess more economic power than entire nations while half the planet lives one emergency away from collapse.

And yet, somehow, the story we’re told is that the problem is immigrants.

We’re told they’re taking jobs.
We’re told they’re draining resources.
We’re told they’re not contributing.

But when you slow down and actually look at the economy, the story starts to unravel.

Take Amazon, for example. Jeff Bezos built one of the most powerful companies in human history — a company that employs millions of people worldwide and touches nearly every aspect of modern life. And yet the median Amazon worker earns a wage that many would struggle to live on, especially as housing and healthcare costs soar. Thousands of Amazon employees rely on public assistance to make ends meet, even as the company generates staggering profits year after year.

At Tesla, workers assemble vehicles that sell for tens of thousands of dollars while many production employees earn wages that barely keep pace with inflation. The labor is essential. The compensation is limited. The ownership, however, is where the real wealth lives.

At Meta and Google, salaries are higher for some roles, particularly in tech. But even there, the gap is breathtaking. A six-figure salary can feel comfortable — even privileged — until you compare it to ownership stakes worth tens of billions. One pays for a life. The other accumulates power indefinitely.

This is the part that rarely gets said out loud: most people work for wages. Billionaires build wealth through ownership.

Wages are finite. You trade time and labor for money. There are ceilings. Limits. Burnout.
Ownership compounds. It grows while you sleep. It scales off other people’s work.

That difference — not effort, not morality, not deservingness — is what creates extreme inequality.
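
To see how sharply those two paths diverge, here is a minimal sketch with made-up numbers: a wage earner saving $20,000 a year versus a $10 million ownership stake compounding at 15 percent annually. Both figures are illustrative assumptions, not data about any real person or company.

```python
# A minimal sketch contrasting wage accumulation with compounding ownership.
# All figures are illustrative assumptions, not real data.

def wage_savings(annual_savings, years):
    """Total saved from wages, with no investment growth."""
    return annual_savings * years

def ownership_value(initial_stake, annual_growth_rate, years):
    """Value of an equity stake compounding at a fixed annual rate."""
    return initial_stake * (1 + annual_growth_rate) ** years

if __name__ == "__main__":
    years = 30
    saved = wage_savings(20_000, years)               # saving $20k per year
    stake = ownership_value(10_000_000, 0.15, years)  # $10M stake growing 15%/yr
    print(f"Saved from wages over {years} years:  ${saved:,.0f}")
    print(f"Ownership stake after {years} years:  ${stake:,.0f}")
```

One line grows by addition and the other by multiplication, and no amount of overtime closes that gap.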

So when politicians, pundits, or social media personalities tell you that undocumented immigrants are the reason you’re struggling, it’s worth asking a simple question: does that actually make sense?

If this were really about taxes, we wouldn’t be targeting people who already pay billions into systems they can’t fully access.
If this were really about work ethic, we wouldn’t ignore the millions of immigrants doing the hardest, least protected labor in the economy.
If this were really about fairness, we’d be talking about tax loopholes, corporate subsidies, offshore accounts, and policies that allow massive wealth to accumulate at the top with minimal accountability.

But we’re not. Because blaming immigrants is easier than confronting concentrated power.

Immigrants are visible. They’re close. They’re vulnerable. Billionaires are abstract. They hide behind corporations, stock tickers, and economic jargon that makes inequality feel inevitable instead of engineered.

So anger gets rerouted. Fear gets weaponized. And people who are barely surviving are encouraged to see other struggling people as the enemy — while the gap between the top and everyone else quietly widens.

The real question isn’t whether billionaires should exist. The real question is why we accept a system where a few individuals can accumulate more wealth than entire countries while the people who make those fortunes possible can’t afford basic stability.

Why are we so quick to punch sideways and downward instead of looking up?

A brown immigrant is not why rent is unaffordable.
A refugee is not why healthcare is inaccessible.
A warehouse worker is not why wages stagnated.

Extreme wealth concentration didn’t happen by accident. It happened through policy choices, corporate power, and narratives designed to keep people divided and distracted.

So the next time someone tells you who to blame for your struggle, pause. Follow the money. Ask who benefits from that story being told.

Because while we argue over borders and scapegoats, a handful of men continue to accumulate wealth beyond comprehension — quietly, legally, and without ever having to explain why so many people are barely surviving.

And that is the real crisis we keep being told not to see.

Tuesday, January 20, 2026

The Illusion of Consistency

There is a moment that shows up again and again in public conversations when the topic quietly shifts. What begins as a discussion about policy, language, or disagreement suddenly stops being about any of those things. Instead, it becomes a test of whether we are willing to extend basic human decency—and whether we are willing to be consistent about it.

That moment doesn’t usually arrive with a dramatic announcement. It slips in subtly. Someone makes a simple observation grounded in everyday life. Rather than engaging with it, the response sidesteps, reframes, mocks, or escalates. Not because the point is hard to understand, but because it asks something uncomfortable of the listener.

The original observation is remarkably simple. For our entire lives, we have adjusted how we address people based on preference. We’ve done it casually and without controversy. “My name is James, but I go by Jim.” “My legal name is Elizabeth, but please call me Liz.” “I don’t like my first name—use my middle name instead.” These exchanges happen in classrooms, workplaces, churches, gyms, and families every single day. They don’t provoke outrage. They don’t require debates about ideology. They don’t inspire threats or insults.

The social action involved is minimal. Someone tells you what they prefer to be called, and you decide whether or not to respect that preference. That’s it.

The current debate around preferred names and pronouns often pretends this request is unprecedented, confusing, or overly demanding. But it isn’t. What’s new isn’t the behavior being asked for. What’s new is the resistance to applying a courtesy we already understand how to give.

One of the most common ways people avoid this point is by retreating into semantics. The response comes quickly: “That’s a nickname.” The goal here isn’t clarity. It’s containment. By narrowing the conversation to definitions, the broader ethical question can be quietly dismissed. If we argue about labels instead of behavior, we never have to answer the harder question lurking underneath: why does personal preference suddenly become unacceptable when it’s attached to an identity some people are uncomfortable acknowledging?

Whether something is called a nickname, a preferred name, or a pronoun does not change the underlying principle. We already know how to adjust how we address people. We’ve always known. The distinction is not about ability; it’s about willingness.

When semantics stop working, the conversation often shifts again—this time away from argument entirely and toward dehumanization. Words like “cultists,” “you people,” or “brainwashed” begin to appear. This language is not accidental. It serves a purpose. Once a group is framed as irrational, dangerous, or less than fully human, their dignity becomes negotiable. Empathy becomes optional. Cruelty becomes easier to justify.

At this stage, disagreement is no longer about ideas. It becomes about allegiance. Political tribalism takes over, and the goal is no longer understanding but winning. You are no longer a person making an observation rooted in lived experience; you are a symbol of a side that must be defeated. Nuance is abandoned. Consistency becomes flexible. Courtesy is treated as weakness rather than a baseline expectation.

The most revealing moment comes when someone invokes state power as a joke or a warning—law enforcement, deportation, surveillance. Even when said flippantly, this move marks a clear boundary. It signals comfort with coercion over persuasion and a willingness to joke about, or threaten, state violence against people they disagree with. At that point, the issue has moved far beyond names or pronouns. The question becomes who is allowed to participate safely in public discourse at all.

What makes all of this especially unsettling is how unnecessary it is. Consistency does not require agreement with every identity or experience. It does not demand full understanding or personal resonance. It only asks that we apply the same standards we already live by. If you have ever called someone by a nickname, respected a stage name, adjusted how you address a married friend, or used a title for a doctor, pastor, or coach, then you already understand the mechanism at work.

Refusing to extend that same courtesy selectively is not a matter of confusion. It is a choice.

So the real question is not whether these situations are technically identical or whether the terminology fits neatly into one category or another. The real question is why basic courtesy is treated as negotiable only when it involves marginalized people.

When the insults, semantic detours, and threats are stripped away, what remains is something very simple and very old. Human decency is not radical. Consistency is not oppression. And the refusal to engage honestly with that reality says far more than any name or pronoun ever could.

Sunday, January 18, 2026

When Rights Are on the Line: Sacred Spaces, Protest, and Accountability in a Fear‑Fueled Moment





In St. Paul, Minnesota, a Sunday service at Cities Church was interrupted by protesters chanting “ICE out” and calling for justice for Renee Nicole Good — a 37‑year‑old mother who was fatally shot by a federal Immigration and Customs Enforcement (ICE) officer on January 7. That moment — intended to be quiet worship — became a flashpoint in a larger national conversation about rights, protest, and the role of law enforcement.

In the aftermath, Christian leaders urged protection for worshippers while also expressing compassion for migrants, highlighting the tension between sacred spaces and civic protest. The U.S. Department of Justice opened a civil rights investigation, citing possible violations of the Freedom of Access to Clinic Entrances (FACE) Act — a 1994 law that makes it a federal crime to interfere with someone’s exercise of First Amendment religious liberties at a place of worship.

But beyond the headlines, this moment reveals something deeper about our national story and how we protect freedom — not just in law, but in practice.

A History More Complex Than the Myth

It’s common in public discourse to say that “our nation was settled and founded by people fleeing religious persecution.” That sentiment is often invoked to frame debates over religious freedom as somehow central to the American identity. But history is far more complicated than that shorthand suggests.

Yes, some early settlers sought refuge from religious oppression. But many of the places they settled were established on land taken by force from Indigenous peoples. Others were deeply involved in the transatlantic slave trade and built their wealth and power on the forced labor of African people brought to this hemisphere against their will. Even within early colonial towns, religious conformity was often enforced, and dissenters were punished.

The point isn’t to dismiss claims of religious motivation — it’s to recognize that the idea of America as a uniformly “religious refuge” simplifies a messy, often unjust past. Laws protecting religious liberty and protest rights weren’t just gifts from that era’s settlers — they were demands made over time, shaped by struggle and legal battles.

Rights Codified in Law and Courtroom Precedent

Today, American rights have far firmer grounding than myth. The First Amendment guarantees the right to freedom of religion, peaceable assembly, and free speech. Supreme Court precedent like Hague v. Committee for Industrial Organization confirms that peaceful assembly is a core constitutional right. Tennessee v. Garner holds that law enforcement may not use deadly force against a fleeing suspect unless there is probable cause to believe the suspect poses a significant threat of death or serious physical injury.

These aren’t abstract ideas — they are legal standards with real consequences. The FACE Act exists precisely to prevent intimidation or obstruction at places of worship, precisely because without such laws, religious freedom could be impeded by force or threat. And protections for journalists and observers — including the right to record law enforcement — have been upheld in cases such as Glik v. Cunniffe and others that affirm a free press is essential to democracy.

In Minnesota, protests following Good’s death have spread far beyond one city. Demonstrations in New York, Seattle, and Washington, D.C., reflect deep national outrage over how the shooting unfolded and how federal law enforcement operates; local leaders like Minneapolis Mayor Jacob Frey publicly rejected federal narratives that seek to justify the use of lethal force. Thousands have rallied, vigiled, and marched, underscoring that peaceful protest remains a powerful civic tool.

Between Safety and Suppression

At the same time, officials are warning against “desecration” and vowing prosecutions under federal law. Some commentators even urge charges against journalists who documented the protest — steps that risk chilling lawful reporting. The justice system’s response to Good’s shooting itself has been controversial: state investigators were removed from the process as the FBI assumed control, a move that local authorities say has undermined transparency and public trust.

Yet the law offers frameworks not for fear, but for accountability. Civil rights statutes, Supreme Court precedents, and the First Amendment all reflect an enduring principle: that public spaces, sacred or civic, are protected not because they are uncontroversial, but because they are essential to a free society.

What This Moment Teaches Us

When peaceful worship is interrupted, when protesters raise their voices against what they see as injustice, and when journalists document it all in real time, we witness the living work of constitutional rights. These freedoms were not automatically respected by our earliest ancestors — they were demanded, defended, and refined through struggle.

History is not a simple story of settlers fleeing persecution. It’s a long narrative of people asserting their dignity, insisting on accountability, and refusing to let fear or force override the principles that bind us. Protecting worshippers, safeguarding journalists, and honoring protest rights are not contradictory goals — they are complementary ones.

A society that values each of these rights equally is a society that honors both its laws and its people. And in times of tension, that is the measure of a democracy that still holds true to its own highest aspirations.



Thursday, January 15, 2026

The Illusion of Safety in Everyday Words

There is a particular kind of confidence that comes with reading an ingredient label out loud and pausing dramatically at every multisyllabic word. The longer the name, the louder the suspicion. I recently watched this happen in real time, the conclusion delivered with a knowing nod: If you can’t pronounce it, you shouldn’t be eating it.

The irony arrived almost immediately. The allegedly alarming ingredients were not synthetic poisons or obscure laboratory experiments. They were thiamine mononitrate, riboflavin, niacin, pyridoxine hydrochloride, cyanocobalamin, and folic acid—scientific nomenclature for vitamins B1, B2, B3, B6, B12, and the synthetic form of B9. Nutrients so well studied that their absence once caused epidemics of preventable disease.

That moment lingered with me, not because it was unusual, but because it was familiar. The phrase itself has become a kind of nutritional shorthand, passed along as wisdom without context, evidence, or curiosity. It sounds like discernment, but it functions more like a filter for discomfort. What we are really reacting to is not danger, but unfamiliar language.

At some point in our lives, most of us could not pronounce the words vegetable or fruit. We learned them gradually, through repetition and exposure, long before we understood photosynthesis, dietary fiber, or micronutrient density. No one suggested we avoid carrots because we mispronounced them. Language acquisition has never been a prerequisite for nourishment.

Scientific terminology exists for precision, not deception. Latin- and Greek-derived names describe molecular structure, bioavailability, and function within the body. Ascorbic acid does not become less effective because it is not labeled “vitamin C.” Tocopherol does not stop acting as an antioxidant because it sounds unfamiliar. Pronunciation is not a proxy for toxicity.

This is where the conversation quietly widens beyond food. The same instinct that treats scientific language as suspicious shows up elsewhere: in public health, in climate science, in medicine. Complexity becomes evidence of conspiracy. Expertise is reframed as elitism. Familiarity is mistaken for safety.

The fear of so-called “chemicals” is a perfect example. Everything we consume is chemical. Dihydrogen monoxide is water. Molecular oxygen sustains life. Botulinum toxin is natural. The distinction that matters is not whether a substance is synthetic or naturally occurring, but its dose-response relationship, its metabolic pathway, and the body of peer-reviewed evidence supporting its use.
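
For readers who want the formula behind "the dose makes the poison": pharmacologists commonly model a dose-response relationship with the Hill equation. Here is a minimal sketch; the EC50 and slope values are placeholders, not measurements for any real compound.

```python
# A minimal sketch of a dose-response curve using the Hill equation:
#     E(C) = E_max * C**n / (EC50**n + C**n)
# The EC50 and slope (n) values are placeholders, not measurements
# for any real compound.

def hill_response(concentration, e_max=1.0, ec50=10.0, n=2.0):
    """Fraction of the maximal effect produced at a given concentration."""
    return e_max * concentration ** n / (ec50 ** n + concentration ** n)

if __name__ == "__main__":
    for dose in (0.1, 1.0, 10.0, 100.0):
        print(f"dose {dose:>6}: effect {hill_response(dose):.3f}")
```

The same substance sits near zero effect at one end of that curve and near its maximum at the other, which is why "is it a chemical?" is the wrong question and "at what dose?" is the right one.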

Nowhere is this clearer than in food fortification, one of the most successful public health interventions of the last century. Many of the ingredients people recoil from on labels were added intentionally after epidemiological research demonstrated their role in preventing disease at the population level.

Folic acid, the synthetic form of folate, was added to enriched grains after decades of data showed it dramatically reduced neural tube defects such as spina bifida and anencephaly. Iodine, introduced through iodized salt, nearly eliminated endemic goiter and supported normal thyroid hormone synthesis. Vitamin D fortification in milk helped prevent rickets by improving calcium absorption and bone mineralization.

Niacin, or vitamin B3, was added to flour to prevent pellagra, a condition once characterized by dermatitis, diarrhea, dementia, and death. Thiamine supplementation reduced cases of beriberi, a neurological and cardiovascular disorder linked to polished rice consumption. Riboflavin fortification supported cellular respiration and energy metabolism, preventing deficiency syndromes that disproportionately affected low-income populations.

Iron fortification addressed widespread iron-deficiency anemia, particularly among children and people who menstruate. Calcium was added to juices and plant-based milks to support skeletal integrity and reduce osteoporosis risk. Omega-3 fatty acids, including docosahexaenoic acid and eicosapentaenoic acid, were incorporated into certain foods to support cardiovascular and neurocognitive health.

Even fluoride—though added to water rather than food—stands as one of the most rigorously studied and effective public health measures for preventing dental caries. None of these interventions were based on vibes. They were based on randomized controlled trials, longitudinal studies, and decades of biochemical research.

Yet in misinformation culture, certainty often replaces understanding. Slogans outperform nuance. Fear feels empowering because it requires no follow-up questions. “If you can’t pronounce it” becomes a substitute for learning what something is, why it exists, and how it functions physiologically.

This kind of thinking rewards confidence over accuracy and skepticism over literacy. It frames ignorance as intuition and casts curiosity as naïveté. Most dangerously, it teaches people to distrust the very systems—nutrition science, public health research, regulatory review—that have measurably extended life expectancy and reduced disease burden.

A more honest approach to food, and to information more broadly, starts with different questions. What is the compound’s biochemical role? How is it metabolized? What does the preponderance of evidence say about its safety and efficacy? Who benefits from its inclusion, and who would be harmed by its absence?

We deserve better heuristics than fear dressed up as wisdom. If pronunciation were the standard for safety, many of us would have been malnourished long before kindergarten.

Sometimes the ingredients with the longest names are not the problem. Sometimes they are the reason entire populations are healthier than they once were.
