Senate page     Sep. 11


New polls:  
Dem pickups: (None)
GOP pickups: (None)

Please consider observing a moment of silence for the victims of 9/11 on this, the 20th anniversary of that event.

Saturday Q&A

A pretty good mix of questions and, as with last week's mailbag, a certain New-Yorker-turned-Floridian is scarce.

Q: I was supposed to die last year (medical reasons), but told a friend in the same situation (but a different diagnosis) that I intended to live to vote for Joe Biden, and he said he shared the same motivation. We lived and voted and kept living to observe the results. Biden has now had 7 complete months in office. Would you list 7 times that he's shown leadership (not necessarily month-by-month). The first one is easy—yesterday you gave us "Biden Lays Down the Law." Only 6 to go. M.G., Boulder, CO

A: First of all, we're glad we didn't lose you!

As to your question, Barack Obama famously observed "Elections have consequences." We are going to adapt that into a loose definition of leadership: "Making choices that have consequences." It's not really leadership to rename a post office or to pose for a photo-op with the team that won the Super Bowl. And with that conceptualization in mind, here are our six additional examples of leadership from Biden, ranked in rough order of how consequential each was:

  1. Choosing the cabinet, the most diverse in U.S. history
  2. Getting nearly all schools reopened
  3. Influencing the infrastructure bills (this will move up the list when and if they are passed)
  4. Committing to, and delivering, 200 million vaccine shots in under 100 days
  5. Influencing and signing the COVID relief bill
  6. Withdrawal from Afghanistan

If we were to include the laying down of the law in the list, we would put it between #2 and #1.

Q: I want to state upfront that I applaud President Biden's efforts to vaccinate every American and stop the spread of COVID. Stepping back for a moment and looking at the situation from a coldly political perspective, though, wouldn't the Democrats stand to benefit at the polls if they left the unvaccinated alone at this point? Since 99% of those dying from COVID are unvaccinated, and since Republicans make up the vast majority of the unvaccinated, wouldn't this mean fewer Republican voters in 2022? By the same token, I do not fully understand why the Republicans are not more pro-vaccine; the Republican Party cannot afford to lose any more voters. It's like the Republicans are intent on shooting themselves in the foot on this issue. W.R., Tysons Corner, VA

A: First of all, Biden really believes that he is president to all Americans, and would never be that mercenary. Second, some of the groups that are undervaccinated—Black people, young people—skew heavily Democratic. Third, and perhaps most important, those who use biological agents as a weapon (or who passively allow them to act as a weapon) invariably learn the lesson that germs have no loyalty, and eventually turn on their "masters." If Biden were to allow natural selection to take its course in red states, he would risk the development of a new, vaccine-resistant COVID variant. And if that variant were to reach big cities, which it surely would, then all of a sudden the majority of the deaths would be Democrats.

As to the Republicans, it basically boils down to their being in the thrall of someone who does not mind being a hypocrite, and whose followers don't care if he's a hypocrite. His personal needs meant taking the virus seriously, which he did, getting vaccinated and also taking advantage of cutting-edge treatments with some scientific basis. His political needs meant turning the virus into a "culture wars" issue, which he did, lamenting masks and vaccines. At this point, it's beyond his control; when he comes anywhere near a vaccine endorsement, his followers boo him. And if it's beyond his control, then it's certainly beyond the control of his wannabe clones, like Govs. Ron DeSantis (R-FL) and Greg Abbott (R-TX).

Q: I know you mainly cover politics but since you dabble in COVID news and are very data driven, I was curious if you had seen this peer-reviewed study that shows ivermectin to be 83% effective at preventing COVID-19 infection? S.H., San Francisco, CA

A: Note that we're not Anthony Fauci, and that our staff virologist has decided to pursue his dream of becoming a scuba pizza delivery man. However, we can certainly evaluate the quality level of a journal, and we can read an abstract.

Let us start with the reasons to be cautious. It's only one study, and before any study's findings can be taken seriously, they have to be affirmed by an additional study (or, ideally, by additional studies). Further, Cureus' model is, in effect, "quick turnaround." The general idea is that scientists should be as up-to-date on their colleagues' work as possible. So, the peer review here is not as rigorous as with most other journals. Cureus isn't a predatory journal, where people pay a fee and publish basically anything they want, so as to get a line for their curriculum vitae. But it ain't Nature or Science, either.

As to the study itself, the authors do make a point of repeating that 83% number a lot, because it is pretty compelling. However, there are some rather significant qualifiers. First of all, the study participants were all healthcare workers. The paper does not make clear how many of them were vaccinated (and the failure to include that information is something of a red flag), but surely many of them were. On top of that, they surely all took the general precautions medical professionals take, like wearing gloves and masks. And so, the ivermectin was not administered in a vacuum; it was given alongside other preventive measures. Indeed, the authors themselves note that "Ivermectin is a safe and effective strategy to prevent COVID-19, in the containment of pandemic alongside vaccine" (emphasis ours).

Further, the study participants either took zero, one, or two (small) doses of ivermectin. Of the 1,147 folks who did not take the pill, 133 (11.5%) developed COVID-like symptoms within the next month. Of the 2,199 folks who took two doses, 45 (2%) developed COVID-like symptoms within the next month. So, that 83% drop was really a drop from 11.5% to 2%, which is statistically significant, but isn't quite as impressive as a framing that gives the impression that the vast majority of people could be saved by taking ivermectin.
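For readers who want to check the arithmetic, here is a quick sketch of how the headline number is derived from the raw counts quoted above. The case counts come from the study as described; everything else is just the standard relative-risk-reduction calculation, shown side-by-side with the (much less dramatic-sounding) absolute reduction:

```python
# Deriving the headline figure from the raw counts quoted above.
# Relative risk reduction (RRR) compares the two groups' infection rates;
# absolute risk reduction (ARR) is the simple difference between them.

untreated_cases, untreated_n = 133, 1147   # zero doses of ivermectin
treated_cases, treated_n = 45, 2199        # two doses

risk_untreated = untreated_cases / untreated_n   # roughly 11.6%
risk_treated = treated_cases / treated_n         # roughly 2.0%

rrr = 1 - (risk_treated / risk_untreated)
arr = risk_untreated - risk_treated

print(f"Relative risk reduction: {rrr:.1%}")   # low 80s, per the study
print(f"Absolute risk reduction: {arr:.1%}")   # under 10 points
```

The same data thus support both framings: a roughly 83% relative reduction, or a roughly 9.5-percentage-point absolute one. The former sounds far more impressive, which is presumably why it's the one in the abstract.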

Ultimately, the study's conclusion boils down to this: A small amount of ivermectin, used in conjunction with other preventative measures, may help increase resistance a bit. This seems plausible to us, but again, we'd want to see more studies. In any event, that is way different from this: A huge amount of ivermectin, used as the sole preventative against COVID, will cure/prevent the disease. And that latter assertion is what many right-wingers are peddling.

Q: I believe that you have pointed out at least once that the Reconciliation Infrastructure Bill will be spent over ten years. Why are we still calling it "the 3.5 trillion dollar bill"? Would it not be more palatable to moderates if we called it a "350 billion dollar per year bill"? The GOP supporters are hard-core at using verbal tricks to load the dice; we need to fight back. S.Z., New Haven, CT

A: The problem is that while Joe Biden & Co. certainly don't want to scare off folks like Sen. Joe Manchin (D-WV), they also want voters to think that the Democrats are doing BIG things. And that impression is best encouraged by regularly using the total figure, not the annual figure.

Q: You wrote: "...Texas adopted an anti-abortion bill designed to do an end run around the legal system. It makes those who obtain abortions after six weeks of gestation guilty of a criminal offense..."

I haven't seen anyone report that there's a criminal offense. I have seen many media outlets report that you can sue someone responsible for the procedure.

Can you shed some light on where the criminal penalty is?
R.C. in St. Paul, MN

A: You're right, that wasn't expressed in a very accurate fashion. The bill does not explicitly criminalize abortion, because that would give the Department of Justice a clear target to go after. However, it is written vaguely enough that a creative judge could plausibly impose penalties on a woman (or a doctor).

With this said, the common framing that "Texas bill bans abortion after six weeks" is also somewhat imprecise. Because there is no explicit criminal or civil penalty for the woman getting the abortion, it technically doesn't ban abortion at all. It just makes it nearly impossible to get one (at least, without leaving Texas) by punishing anyone who might help out after six weeks. And "six weeks" itself is also imprecise, as the count begins with the last menstrual cycle, and not with impregnation, so it's actually something more like "a few weeks" (or less).

Q: Arnold Schwarzenegger says that Gov. Gavin Newsom (D-CA) did not handle the coronavirus well.

With the glaring exception of the Governor's initial December 2020 misstep with both the awful French Laundry optics and perhaps re-opening California too soon, am I wrong to think that for the most part it seems that he has not mishandled the pandemic at all? Perhaps not perfectly, but certainly not disastrously.

It seems no state has escaped the pandemic unscathed. However, in your opinion, was/is there any governor who deserves props for handling the pandemic not only well, but better than Newsom?
L.K. in Los Angeles, CA

A: It would seem to us that the governors who did best have several characteristics in common: (1) they made more good decisions than bad ones, (2) they put public health first and politics second, and (3) they kept citizens informed.

There are some states where the governor was dealt a much easier hand when it came to COVID. Vermont has the lowest per capita infection rate of any state, but that is due significantly to its being fairly sparsely populated, having few big (or even medium-sized) cities, and having a relatively low number of visitors from outside the state/country.

Among the more challenging states—fairly populous, some/many big cities, lots of interstate and international visitors—it appears that Newsom, Andrew Cuomo, Jay Inslee (D-WA), Jared Polis (D-CO), and Larry Hogan (R-MD) all did pretty well. They each had missteps, like Cuomo's nursing home scandal, but overall they seem to have met the moment. There are other folks who would have been on the list, like Mike DeWine (R-OH), but they eventually allowed politics to interfere too much in their decision-making process.

Q: You wrote: "Yes, [Feinstein] could resign this weekend and Newsom could pick a replacement on Monday, right before Tuesday's special election..."

But the changing of the guard does not take place at the time of the election, does it? Wouldn't Newsom have a month left in office? She could resign during that month and Newsom could name himself as her replacement.
J.R., San Francisco, CA

A: By the terms of California law, election officials have 28 days to certify the election, and then there can be up to 10 days between final certification and the swearing-in of the new governor. In the last recall, Gray Davis was clearly beaten, such that he conceded on the day of the recall (Oct. 7, 2003). However, his last day in office was Nov. 17, 2003.

However, it is not 100% clear that, if recalled, the governor retains full authority between the recall election and the swearing-in of a successor. In normal lame-duck circumstances, an officeholder has nonetheless been elected to serve during the lame-duck period. In this case, however, the officeholder has literally been un-elected for that period. Should Newsom apparently be recalled, and should he then try to appoint a replacement senator while waiting for his successor to be sworn in, the RNC would file a lawsuit instantly, and it might just win. That could lead to a constitutional crisis, though, because Art. I, Sec. 5 of the Constitution says: "Each House shall be the Judge of the Elections, Returns and Qualifications of its own Members ..." The Democrats in the Senate might just say to the Supreme Court: "You have no jurisdiction over who is a member of the Senate, so we are going to ignore your ruling and seat Newsom's appointee. Sorry about that."

Q: You have recently posted a recommendation on who Democrats should vote for if Gavin Newsom is indeed recalled. Would it be possible to get a reminder, so I can send it to my friends in California? I.M., New York City, NY

A: Please note this is not an endorsement of the sort that newspapers do. We did not review the candidates and choose the best one. No, we are being guided entirely by the polls. And what the polls say is that the only two people with a chance to win the replacement election, should it become necessary, are Larry Elder and Kevin Paffrath. The former is ultra-right-wing, and the latter is a centrist who is currently calling himself a Democrat. Democratic voters thus have a choice of wasting their vote, supporting a Ron DeSantis of a different color, or supporting someone who at least agrees with the Democrats on some issues. If you're a member of the blue team, none are great options, but Paffrath is clearly the least bad of the three.

Q: If the "retain Newsom" vote wins, as I expect it will easily, will they even bother to report the votes on the replacement race, since that part of the ballot will be moot? L.S., Greensboro, NC

A: Yes, they will. First of all, election officials do not have the authority to decide which results people need to know and which ones they don't really need to know. They have to report them all. Second, in this particular case, Republicans are already promoting the narrative that the election is rigged. The totals for the replacement candidate, even if they become irrelevant, should help to push back against that narrative.

Q: Given the systematic errors made by pollsters in 2016 and 2020, how much confidence should I have in the recall polling that shows the recall failing? S.C., Mountain View, CA

A: With apologies to Mark Twain, the errors of pollsters have been greatly exaggerated. They did blow a few states in each of those two elections, although all but one in each case were within the margin of error (the exceptions being Wisconsin in 2016 and Florida in 2020). They also blew some Senate elections big-time in 2020, most obviously Maine. But again, it wasn't a train wreck.

Because the important question in the recall is a yes/no, that's a bit easier to poll. People are less likely to flip from one side of that to the other than they are to flip between "third party candidate" and "major party candidate that I don't really like that much." Further, the main problem with polls in 2020 (and 2016, for that matter) appears to have been properly accounting for Trump voters. However, Trump got only one-third of the Golden State vote in 2020, and less than that in 2016. So, the challenges of polling Trump voters should be much less than in a state like Florida, where he got 50% of the vote.

Q: You referred to The United States as a low-corruption country. Really?

The U.S. is the largest military spender in the world. We spend more than the next ten countries combined, including China and Russia. Politicians win elections "supporting the military," when all they're really supporting is the military-industrial complex.

And every year we spend more. Our 2019 spending was a 7.22% increase from 2018 which was a 5.53% increase from 2017. Almost none of it is audited or is legally allowed to be audited.

Plus an estimated $21 trillion has simply gone missing. That doesn't include the trillions being spent building weapons nobody wants or needs so the military-industrial complex can continue getting obscenely wealthy off the taxpayers.

What remains missing from most discussions is that we need to wage endless wars to justify all this. Otherwise, we would have more than enough money to end poverty and homelessness in the U.S., and give every American healthcare and a college education with money to spare.

Sounds pretty corrupt to me. Do you really think the U.S. is a low-corruption country?
S.S., West Hollywood, CA

A: This is not an area in which we have any expertise, so we would not presume to make any judgments of our own. As we stated in the answer, we were merely repeating the conclusions reached by folks who are experts in corruption.

That said, this is clearly a tough thing to quantify. The best-known ranking is the Corruption Perceptions Index (CPI), and it's been criticized for some of the issues you raise: focusing on one type of corruption to the exclusion of others, prioritizing perceptions over reality, and not paying enough attention to the private sector's role in promoting and facilitating corruption.

You specifically mention the Military-Industrial Complex (MIC). And there's certainly a case to be made for that as a form of corruption. The CPI folks' definition of corruption is "abuse of entrusted power for private gain," and the MIC would seem to fit. Certainly, Dwight D. Eisenhower felt that way, and Gen. Smedley Butler concurred.

Q: Which state has the best motto? A.J., Baltimore, MD

A: We thought we might answer this by reading a list of mottoes, and seeing how many of them were so on target that it wasn't even necessary to see the name of the state to know which state's motto it was. As it turns out, there really aren't any like that, because they are mostly empty platitudes, like "Wisdom, Justice, Moderation" (Georgia) or "All for our country" (Nevada). There are a couple that are identifiable because they're in a distinctive native language, like "Ua Mau ke Ea o ka ‘Āina i ka Pono" (Hawaii), or that might be inferred if you understood the reference, like "Eureka" (an allusion to the California gold rush). But there wasn't a one that we felt captured the essence of the state in a really clear way.

That being the case, we're left to just pick the one that sounds the best. And for that, we will go with Wyoming: "Let weapons yield to the toga."

Q: We're familiar with Fox's primary audience (elderly conservatives), but what demographic watches the Sunday morning news/political opinion shows?

I am a hardcore consumer of political and foreign policy news, but I have never watched any of these programs, probably because I never watch any daytime television programming. So who is tuning in on Sundays? Not the churchgoers, nor anyone with school-age children. The networks regard the shows as prestige programs because of who they employ as hosts, with their considerable salaries.
M.M., San Diego, CA

A: The shows actually pull pretty pedestrian ratings. "Meet the Press" is the highest-rated by a fair margin, and it has about 500,000 viewers at any given time. By contrast, Fox's primetime schedule (Sean Hannity, Tucker Carlson, etc.) often attracts 2 million, and a popular primetime network TV program, like "Dancing with the Stars" or "The Voice," gets 6-10 million.

To a very large extent, these news shows exist to tighten the relationship between the networks and Washington insiders. By a fair margin, the city with the largest viewership for the Sunday morning programs is...Washington, D.C. The shows afford politicians a chance to make a statement on the issues of the day, or to plug whatever they're trying to plug, or to spin whatever they're trying to spin, largely unencumbered by time constraints. Further, the Sunday morning programs get lots of Monday press coverage, which is good publicity for the networks and also pleases their politician-guests.

As to the demographics of the viewership, the predominant group is middle-class and wealthier people, ages 25-52.

Q: How can I find a breakdown of the 2020 Presidential vote by race, ethnicity, education level (eg. only high school, only community college, etc.), age, rural-suburban-urban, religion? D.S. in New York City, NY

A: Such analysis is dependent on exit polls, since that's the only way to collect that data. And the people who are best at collecting and crunching exit poll data are the folks at Pew.

Q: We are hearing a lot about the fraudulent audits being pushed by the GOP, but are legitimate audits a regular part of election administration? Do any states conduct regular, random audits of election results to make sure reported totals match actual votes? M.H.S., Louisville, KY

A: Yes. There are thousands of audits each year. As you correctly guess, election officials double-check randomly selected samples of votes (particularly those cast using machines) to make sure everything is working correctly. And most places have laws that call for an automatic, comprehensive audit if the results are close. Sometimes there are two or three of those, as was the case in Georgia last year.

Q: When projecting the outcome of Congressional elections, districts are usually described as R+2 or D+2. However, around a third of all voters are independents or non-aligned. Do those numbers represent party registration? The outcome of past elections? It seems to me that if most non-Republicans get fed up with GOP lies and conspiracy theories, an R+5 could easily become a D+5 very quickly. What do you think? J.D., Rohnert Park, CA

A: The numbers are based entirely on the two-party vote in the most recent two presidential elections. A person who votes third party is thus not included, though most people who are registered third party/independent ultimately end up voting for a major-party candidate, and so they are included. Party registration plays no role in determining the PVI; only the actual votes matter. It's true that a district can swing a fair bit, though 10 points would be a lot since, again, most people end up voting for a major-party candidate, and so are already accounted for by PVI.
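For the curious, the mechanics can be sketched in a few lines. This follows the classic (unweighted) version of the Cook PVI method described above: average a district's two-party presidential vote share over the last two elections and compare it to the national average. Note that the real index now weights the more recent election more heavily, and that the district totals below are invented for illustration (the national figures roughly approximate the 2016 and 2020 two-party totals):

```python
def two_party_share(dem, rep):
    """Dem share of the two-party vote (third parties excluded)."""
    return dem / (dem + rep)

def pvi(district_results, national_results):
    """Classic unweighted PVI: district's average two-party Dem share
    over two elections, minus the national average, as 'D+n'/'R+n'.
    Both arguments are lists of (dem_votes, rep_votes) tuples."""
    d_avg = sum(two_party_share(d, r) for d, r in district_results) / len(district_results)
    n_avg = sum(two_party_share(d, r) for d, r in national_results) / len(national_results)
    lean = round((d_avg - n_avg) * 100)
    if lean > 0:
        return f"D+{lean}"
    if lean < 0:
        return f"R+{-lean}"
    return "EVEN"

# Hypothetical district that ran a few points ahead of the nation both times:
district = [(120_000, 100_000), (130_000, 105_000)]
national = [(65_900_000, 62_900_000), (81_300_000, 74_200_000)]
print(pvi(district, national))  # → D+3
```

Notice that registration never enters the calculation; a registered independent who voted for Biden counts exactly the same as a registered Democrat who did.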

Q: If Rep. Lauren Boebert (R-CO) chooses to retire or loses her reelection bid, is she entitled to a lifetime Congressional pension? Will it be a full amount or will it be based on time in office? Do her children also stand to benefit with a paid education? Is this why she ran: In order to obtain lifetime benefits? M.C., San Francisco, CA

A: Congress offers four different pension plans, and so the exact answer depends on which one Boebert chose (or if she chose one at all; a few Republicans, like Ron Paul, have declined because they disapprove of pensions). However, members still have to pay in to whatever plan they pick. And to get a full pension, they have to reach the age of 62 and have at least 5 years of service, or they have to reach the age of 50 and have at least 20 years of service. Boebert is not close to any of these cutoffs, so if she were to leave Congress in 2023 and take her pension immediately, she wouldn't get much (a few hundred dollars a month). If she waited until she turned 62, she'd get more, but not a whole lot more (maybe $1,000 a month in current dollars).

There are no additional benefits, with one exception: a person who retires from the federal government with a pension, and who has carried one of the available health insurance plans for at least 5 years, has the option to continue that coverage for themselves, their spouse, and their dependent children under 25 (with the government paying the employer portion of the premiums). Members of Congress choose their health insurance from the Affordable Care Act (ACA) exchanges. There's no free education for the kids, regardless of length of service.

Q: The September issue of National Geographic led with an article whose subtitle was "RACE IS A SOCIAL CONSTRUCT, NOT A BIOLOGICAL TRAIT. THAT'S THE SCIENTIFIC CONSENSUS—SO WHY DO MANY STILL DOUBT IT?" I've seen something similar said on this site. I would agree that most of what we perceive as racial differences are cultural, economic, and historical effects, but after a lifetime observing visible physical differences between populations from different parts of the world, "no biological difference" is hard to choke down. Before I push back against this restatement of reality, I'd like to make sure I understand how the people saying this define "race" because I'm pretty sure it is different from mine. Can you explain it to me? R.T., Arlington, TX

A: If it helps, (Z) asked this exact same question of his TA in Anthropology 15 (human evolution). And the TA had no answer, other than to get irritated.

Anyhow, the general argument here is that racial categories are somewhat arbitrary, and are fungible. "Irish" used to be a race, and now it isn't. "Native American" didn't use to be a race, and now it is. "Jews" were (and sometimes still are) slurred as "a race," because it was a useful way to make them an "other." Even within races there is a lot of arbitrariness. Why are Chinese people and Indian people in the same racial group? Why are Iranians and Syrians in different racial groups? Who's the jerk that made it possible for Americans and Canadians to be in the same race?

In the past, racial groupings were rooted in many different things, including place of origin, tribe or caste, religion, and appearance. Today, racial groupings are almost always based, at least in part, on appearance/phenotype. And the various phenotypes sometimes correlate with certain physical and physiological differences. Ethiopians tend to be particularly suited to marathon running, many Japanese people do not metabolize alcohol effectively, Brazilians are particularly prone to diabetes, and so forth. There are, of course, socioeconomic and cultural factors at play here, in addition to underlying genetics.

What we are trying to say is that the characteristics we use to "organize" people into race sometimes line up with additional physical characteristics. Those additional characteristics have nothing to do with society's putting those folks into a racial group; it's just a coincidence. This is a subtle point, perhaps; enough so that (Z)'s TA couldn't explain it.

Q: After reading last week's Q&A, I did a quick search of Thomas Jefferson and "religion of one," but failed to find Jefferson's use of that phrase or its meaning. Could you enlighten us? J.K., Silverdale, WA

A: Jefferson's views on religion were always evolving. He studied many religious traditions, and tended to pick and choose the bits and pieces he found agreeable. For example, he thought it made sense that God was more like a divine clockmaker rather than a personal deity, which is an idea that comes from the deists. He loved the philosophical elements of the New Testament, but thought the miracles and magical stuff were silly. He hoped there was some sort of afterlife, but was not sure. Jefferson never found a religion with views that aligned with his, hence his sense that he was unique.

The "religion of one" construction was something he apparently used in conversation a few times. When he was writing, he tended to be a bit more formal and a bit more poetic. And so the phrasing that is fully documented is "I am of a sect by myself," which obviously means the same thing, and just sounds a little more majestic. Here is the letter where he wrote that, and also expounded a bit on his beliefs.

Q: The New York Times published this article attempting to explain why the U.S. is one of the few countries in the world that is now restricting abortion rights. Among their answers was "minority rule," highlighting in particular the Supreme Court:

Electoral College and Senate maps have always tilted American elections to favor certain voters over others, for instance by granting rural states outsized representation. For the first time in American history, demographic groups that tend to support one party, the G.O.P., overwhelmingly cluster in the areas that receive disproportionate voice. As a result, Supreme Court justices are increasingly likely to be appointed by a president who lost the popular vote and confirmed by a Senate elected by a minority. Republicans won the national popular vote in only one out of the last eight presidential elections, but have appointed six of the nine current Supreme Court justices.

The article's focus on the Supreme Court is a bit misplaced—by overruling or limiting Roe v. Wade, the Court can enable abortion restrictions, but the actual imposition of restrictions on abortion requires action by legislatures (which of course may also reflect minority rule if gerrymandered and/or determined geographically skewed like the Electoral College). But my question is about the assertion that this is the "first time in American history" when a particular political party's adherents have "overwhelmingly cluster[ed]" in the small states to which the structure of the Senate and Electoral College give outsize representation. I'm willing to believe that population has never been this skewed between relatively few populous states and relatively numerous less-populous states. But I'm skeptical that the type of geographically-correlated partisan division is really unprecedented. With your expertise in both electoral votes and history, you seem well-positioned to answer: did the Times get this point right? S.G., Newark, NJ

A: We would say they did not.

At the outset of the Civil War, there were 5.5 million free people and 3.5 million enslaved people in the Confederacy. There were 18.5 million people in the North. That means the latter population was more than double the population of the former. If you exclude the 3.5 million folks held in bondage (since the Constitution certainly did at that time), then the North had well more than triple the population of the South. And yet, the 15 presidents before Abraham Lincoln included 9 Southern slaveholders, and another 2 staunchly-pro-South Northerners (James Buchanan and Franklin Pierce; they were called "doughfaces"). Through 1850, the Southern slave states controlled half the Senate (and they controlled almost half thereafter). At the dawn of the Civil War, a majority of Supreme Court justices were Southern slaveholders, including Chief Justice Roger Taney.

In short, antebellum Southerners clearly exercised grossly disproportionate control over the federal government relative to their numbers, even more so than Republicans today. In order to make the Times' statement true, you really have to start splitting hairs. You could argue that the pro-slave forces were a faction and not a "party," although most of them were Democrats before 1850, and all of them were Democrats thereafter (once the Whig Party collapsed). You could also argue that there were Democrats outside the South, and so the party wasn't really "geographic." Maybe so, but it's also the case that there are Republicans in all 50 states today.

Q: Thank you for your synopsis of which Civil War monuments were placed when, where, why, and by whom in the generations after the war.

Could you offer a similar synopsis of what happened to the original Southern war dissenters during that time? When the war was breaking out, many people across the South opposed secession. Every state except South Carolina provided at least one regiment to the Union armies. In some areas the minority viewpoint was quite sizable, proportionally. How did these voices become subsumed into the new narrative of 1890, and when? Or did they? For example, were no monuments to Lincoln ever placed anywhere by them in the South?

Perhaps those people were intimidated into silence during the war? Did they simply close ranks in solidarity with their neighbors once things got started? Perhaps they changed their minds in the face of the deprivations of war itself (hard to empathize with the Union when William T. Sherman is burning your food)? Perhaps they were pro-Union but also pro slavery, and Lincoln's freeing the slaves flipped them?
J.G. in Albany, CA

A: As your question suggests, it's complicated. Some pro-Union folks were eventually won over to the Southern cause, for one reason or another. Others stayed pro-Union but kept that under their hats. Still others were quite vocal about it. The best known examples in the latter group were giant-of-Texas-history Sam Houston, whose unionism made him a pariah before he died in 1863; Newton Knight, who led a behind-the-lines guerrilla resistance, and whose story was recounted (and heavily dramatized) in the 2016 Matthew McConaughey movie The Free State of Jones; and the people of Western Virginia, who decided to secede from their state, creating the state of West Virginia. After the war, being pro-Union remained a socially problematic thing in most parts of the South, and such folks were slurred as "scalawags," particularly if they presumed to join the Republican Party. The historian James Alex Baggett has written a book about these unionists, both during and after the Civil War, appropriately titled The Scalawags: Southern Dissenters in the Civil War and Reconstruction, if you would like to read more.

As to monuments to Lincoln, putting something like that up in the postbellum South would not have been wise, since that was a time when Southern whites thought nothing of taking the law into their own hands. In other words, such a statue would surely have been destroyed, and its installers might have paid a price for their offense. The only notable Lincoln statue in the South is the one of him and Jefferson Davis at Vicksburg, which appears to show the two presidents having a nice chat, despite the fact that they never actually met. It was not installed until 2001.

Q: Given that thousands of free Black folks, many of them escaped slaves from the Southern states, fought for the Union, it has always struck me that nobody has ever seemed in a rush to build statues of them to commemorate their service. These men, after all, put their lives on the line to serve the people of Virginia, North Carolina, Tennessee, etc. by liberating them from enslavement, and to preserve the Union of the states.

Do such monuments exist? If not, what do you suppose accounts—in 2021—for this lack?
J.T. in Greensboro, NC

A: Monuments cost money. A lot of it, usually. You have to hire a sculptor, you have to pay for the sculptor's materials, you maybe have to pay for a base, and you might also have to acquire the land on which the statue will rest.

In the era where people were highly motivated to build monuments (1860s-1920s), the folks with money were generally not that interested in recognizing the contributions of Black people. And in the era where people were highly motivated to recognize the contributions of Black people (1960s-present), the folks with money were not that interested in building monuments anymore.

That said, there do exist two very famous monuments to Black Civil War soldiers. The one produced by the Civil War generation is located in Boston, and was funded by wealthy (mostly white) former abolitionists. It was executed by the sculptor Augustus Saint-Gaudens, was unveiled in 1897, and honors the 54th Massachusetts Infantry Regiment (which was the first formal "Black" regiment—though it had white officers—and was commemorated in the movie Glory). The modern one is located in Washington, D.C., outside the African-American Civil War Museum. It was executed by the sculptor Ed Hamilton, and was unveiled in 1998.

Q: If the Confederacy had successfully separated from the United States, I see no indication that they would have done any better at managing their finances than Texas did when they tried to be an independent state and failed miserably. I got to thinking that they might have become a client state of the British Empire to keep their finances from collapsing. I'm interested in your take on how that would change history. The U.S. would have had the Canadians on the north and the Confederacy on the south, both aligned with the British Empire. How would that have gone? Would the Confederacy have been a money pit for the British Empire (as it has been a money pit for the U.S.) and would it have brought down the British Empire faster? G.W. in Oxnard, CA

A: We think you're right about the economic picture for an independent Confederacy. There are really two major problems. First, the original 13 colonies had enormous trouble building a viable economy as a loose confederation, and struggled to deal with interstate trade, coinage, banking, and a host of other issues. There's no reason to think the Confederacy would have done better. Second, the Confederacy was a one-commodity country, and that commodity was cotton. But partly by coincidence, and partly due to the pressures created by the Civil War, Egypt became a major producer of cotton (and thus a major competitor to the South) in the early 1860s. Egyptian cotton was generally preferable, to the British and French manufacturers that had been purchasing from the South, as it was closer at hand and of higher quality. As a result, the price of Southern cotton crashed: they were getting about $100/bale at the height of their power in 1860, and then they didn't see $100/bale again until...the 1980s.

"British client state" does not seem a particularly viable outcome to us. Britain was not in the client-state business back then, they were in the colony business, and the Southerners were not going to submit to that. Also, the U.S. government would surely not have tolerated having British holdings on both sides of the country. More probable is that the Confederacy would have formed some sort of Pan-American trade alliance with the remaining slaveholding countries, like Brazil, to see if they could hold things together. Alternatively, they might have had to take the Texas route, going hat-in-hand to the U.S. to ask to rejoin (surely without slavery).

Q: The letter from D.B. in Deer Park got me wondering about presidential security, and when it got serious. I recall going downtown for an eye doctor appointment when I was young. After the appointment, my mother and I walked down the street and were standing amid a crowd of people in front of Hudson's when, out of the blue (at least to me), a convertible slowly cruised past about 20 feet away from me, bearing a smiling and waving President Kennedy and Governor Swainson sitting up on the back deck. This was on Woodward Avenue, a street lined with hundreds, if not thousands, of openable windows, and thousands of un-vetted pedestrians. When Kennedy was shot a year later, I could easily understand how a lone gunman could have pulled that off, with or without assistance from shadowy characters or grassy-knoll dwellers. Flash forward to 2005, when George W. Bush visited suburban Detroit, spoke to a vetted, invitation-only crowd at a community college auditorium, then was whisked away in a motorcade along closed roads to Selfridge Air Base, with helicopters hovering overhead, where he waved briefly from the stairs of Air Force One to the distant, vetted crowd.

I don't want to see any president assassinated (despite the whisperings of the little fellow on my left shoulder regarding one particular holder of that office), but we have definitely lost something when our leaders must be hidden behind so many layers of security. What sort of security was provided for presidents and other important officials in the earlier years of the country?
S.S. in Detroit, MI

A: There's a bit of trivia that you'll sometimes hear that the last official act that Abraham Lincoln performed before heading out to the theater on Apr. 14, 1865, was to sign into law a bill creating the Secret Service. Oh, the irony! This is actually true, but it's not so meaningful when you learn that the USSS's original job was to combat counterfeiters and not to protect presidents. If Lincoln had been killed by a paper cut from a fake $20, then it might be ironic.

Anyhow, for a very long time, the idea of presidential security was seen as undemocratic—putting walls between the president and the people—and also as un-masculine and cowardly. Many presidents, notably Andrew Jackson, just protected themselves. Others hired a bodyguard out of their own pockets. For example, Lincoln's bodyguard was named Ward Hill Lamon. Unfortunately for Abe, he had sent Lamon off on an errand on the fateful night, and the bodyguard who was pinch hitting (John Frederick Parker) was lazy and a drunk. If he were still alive, he might be confused with our staff mathematician.

It was after the assassination of William McKinley in 1901—the third president to be assassinated in less than 40 years—that the government got serious about protecting presidents. Still, security was somewhat loose through the 1960s, such that Kennedy observed (correctly, unfortunately) that if a man wanted to trade his life for the president's, he could do it. The USSS has learned from the mistakes it made with Kennedy (and with Gerald Ford, who came close to being assassinated twice in one month), and now runs a very tight ship. It would be very, very difficult to get to a U.S. president today, even if you were willing to trade your life for his.

Q: You have correctly observed that Eugene McCarthy, entering his 20th year in Congress as a Democrat, was nonetheless virtually unknown until his surprise showing in the 1968 New Hampshire presidential preference primary. Four days later, Robert Kennedy, a well-known freshman senator and former U.S. Attorney General, announced his candidacy. Should the non-serious and nonviable lightweight McCarthy have accepted his place in the pecking order by withdrawing? At the risk of coming across as a political science teacher, which I once was, what is the theoretical function of political parties in a democracy, and how would you apply it to this situation? G.H., Chicago, IL

A: One time, a student asked (Z): "What do the parties believe?" And (Z)'s answer was, "The parties, by definition, don't believe anything. Their purpose is to win elections, and they embrace those policy positions and those candidates who they think can best achieve victory. Individual party members, whether politicians or rank-and-file voters, believe things, but parties do not."

The goal of a candidate, meanwhile, is to promote themselves and possibly their political program. A party and a candidate may eventually come together in a symbiotic relationship, because each needs the other to achieve their ends. However, make no mistake: the goals of the one and the goals of the other are not the same, and are often in conflict.

As to McCarthy, U.S. history has had its share of dark horse candidates who came from nowhere to win their party's nomination, and sometimes even the presidency. He was clearly experienced enough to serve in the White House, if elected. And many voters clearly liked what he had to say. The DNC might have preferred that he withdraw quickly, but too bad for them. He earned his shot, and had every right—and perhaps even a duty—to pursue the nomination until it was certain he was no longer viable.

Q: You wrote: "Los Angeles is not Chicago; the mayor does not have an outsized influence on state or national politics."

So the implication is that Chicago's mayor does have an outsized influence on state or national politics? Could you say why that is or give some examples?
J.H., Boston, MA

A: Chicago has been called "the last urban political machine," and for good reason. Long after NYC's Tammany Hall had gone the way of the dodo, the Windy City was still under the thumb of powerful urban bosses, who often also occupied the mayor's office. Richard J. Daley, who ran that city from 1955 to 1976, is the famous modern example, though his son Richard M. Daley was no slouch in the political muscle department. More recently, the man with many strings in his hands has been Michael Madigan, though he was Speaker of the Illinois House, and not mayor.

Anyhow, Chicago mayors controlled (and, in some cases, still do control) much of the patronage in the city. Sometimes this control was backed by organized crime, as well. This affords enormous influence over people already in office, and also over aspiring future officeholders. And so, it was plausible for a person like Daley Sr. to swing the city, and sometimes with it the state, behind a favored Democratic presidential candidate. He most certainly put Illinois in John F. Kennedy's column in 1960, probably aided by a fair bit of chicanery.

Q: There are plenty of over-the-top, ridiculous, discredited 9/11 theories. But some of the concerns and accusations seemed reasonable to me back in the days after the attack when I was riveted to the chatter. My question is: Are there credible resources that challenge some of the official claims and accepted narrative about the attacks? M.R., Cascade, CO

A: You can group things that are at variance with the accepted narrative into three broad categories: (1) the sorts of disagreements that come from different people having different experiences and different memories, (2) assertions that one major part of the accepted narrative is a lie, and (3) assertions that the entire accepted narrative is a lie. The latter two are conspiratorial, the first is just the way life works. You can find examples of things in category 1 easily enough, but there is nothing credible out there that would be in category 2 or category 3.

Q: Why were the weapons of the terrorists on September 11, 2001, not detected before the terrorists entered the airplanes? Are there any major conspiracy theories about 9/11 that are not completely debunked? Are there any good books or documentaries that completely debunk every major conspiracy theory about 9/11? F.S., Cologne, Germany

A: At that time, airport officials were on the lookout for bombs and guns, since those could be used to destroy a plane in midair. They did not take much interest in bladed weapons, since those were deemed to be no threat to the integrity of the plane. Bladed weapons—knives and box cutters—were what the 9/11 hijackers were carrying. Further, the standard protocol of that time was to cooperate with hijackers in order to keep the passengers safe. It had not occurred to anyone that hijackers might be willing to sacrifice their lives in order to turn the plane itself into a weapon. Obviously, these assumptions have been revisited and corrected for, and it would not be possible to duplicate 9/11 today.

As to conspiracy theories, we note above that there are none that are credible. If you want to know more, there is the 9/11 Commission report, or this podcast, entitled "Conspiracy Theories And The Sept. 11 Terrorist Attacks," which runs down and debunks the major conspiracy theories, or this documentary from PBS, entitled "Debunking 9/11 Bomb Theories," or this book, entitled Debunking 9/11 Myths: Why Conspiracy Theories Can't Stand Up to the Facts.

Q: Your write-up of Civil War monuments got me thinking about the way many states require students to learn U.S. History by taking two separate courses. Here in Texas, young scholars are exposed to the first half of U.S. History at whatever tender age they happen to be while in the 8th grade. In theory, this course covers topics from the colonial era through the aftermath of the Civil War (sometime around 1877). The second half of U.S. History comes three years later, during their junior year, and supposedly covers 1877 through 2008. Any K-12 teacher can attest to the mad dash through the curriculum that occurs in the final weeks of the course, often giving short shrift to the final unit of study. This results in many students having a poor grasp of topics related to Reconstruction and to...whatever bland name is given to a final unit that covers the "important events of the 1980s, 1990s, and 2000s." As a high school teacher who's taught U.S. History and Government courses over the years, I have always found it unfair that the standards and time periods we are expected to cover keep increasing while middle-school teachers keep the same set of topics.

Two questions: If you were in charge of creating curricular standards for U.S. History: Would you prefer requiring three courses to better cover U.S. History topics across all time periods, or would you change the split year between the two courses? What dates/topics would make good endpoints for a 3-course study of U.S. History?
M.M., Houston, TX

A: If (Z) were given dictatorial powers over history education, he'd prefer not to use a narrative approach at all. It's too much information, not enough thinking, and doesn't afford discussion of longitudinal effects. World War I substantially caused the Cold War, but it doesn't work well to point that out in your WWI lecture if you're using a chronological narrative. He would prefer to focus on specific skills, and on in-depth examinations of specific case studies.

That's never going to happen, of course. So if we're going to stick with narrative, then probably better in three parts than two, just to give more time to explain things. And the only breakdown that (Z) can really conceive of is: (1) 16th-18th centuries, (2) 19th century, and (3) 20th and 21st centuries. At least, that's how it was broken up when he took U.S. historiography as a three-quarter course in grad school.

The problem here is that the stuff from colonial and Revolutionary America is far and away the hardest to teach, because it's so foreign to us today. And under the scheme you describe, that would mean that it's in an early grade (4th or 8th) that the students get the early stuff. This is not ideal, and is another argument against chronology.

If you stick with a two-part course, you pretty much have to split it at 1877, because the major storylines of early U.S. history end, and the major storylines of modern U.S. history begin. The other options are 1850, but that would make the second half very busy, or 1896, but that would make the first half very busy.

Q: You wrote: "We trimmed out predictions that have already proven indisputably incorrect." Why? Since you have complained before that pundits are rarely held accountable for bad predictions, I would have expected you to be more scientific and include negative results. S.S. in San Luis Obispo, CA

A: Well, to start, this isn't scientific in any meaningful way. But, in any case, we trimmed them for two reasons: (1) It seemed unfair to do that to the readers who made those predictions, like holding them up to ridicule, and (2) reading already-wrong predictions just isn't that interesting, as it turns out.
