The big picture on election security in the 2020 campaign after Super Tuesday: Could be worse — but also could be better.
The biggest day of voting so far in this year’s race wasn’t problem-free: Officials dealt with problems in Texas, California and North Carolina, plus tornadoes disrupted the vote in middle Tennessee.
And, as national security officials acknowledged before and during the vote on Tuesday, foreign malefactors continue to try to influence the information environment in the United States via agitation and disinformation on social media.
Even so, Americans appear to have been able to cast a ballot as they wished without major cyberattacks, information dumps or other mischief like that seen in the wave of active measures launched by Russia in 2016. So far.
“Tuesday may have been a success, from the perspective of foreign influence — but folks ought to remain vigilant,” said David Levine, a former elections supervisor who now serves as elections integrity fellow with the Alliance for Securing Democracy, a group in Washington.
The problems and disruptions that took place around the country involved voting equipment, shortages of poll workers and long lines of voters.
They followed comparatively smooth primaries in South Carolina and New Hampshire, a comparatively smooth caucus in Nevada and the high-profile implosion of Iowa’s caucuses — itself the result of problems with an app used to tally and report caucus results.
With public confidence in elections already shaken by the 2016 experience, even innocent problems or disruptions can have a corrosive effect, Levine said.
Counties and local jurisdictions can buy back goodwill by executing well in states that still are scheduled to vote between now and November — but they also can inadvertently fuel suspicions, he said.
“It’s really important that local elections officials have an opportunity to assess what worked, what didn’t and work assiduously to try and mitigate these issues,” Levine said.
“The perception of interference can be as dangerous as interference — the administration of elections needs to be as seamless as possible. Long lines at polling places can mean people choosing to leave lines and not vote, or not vote in future … it has the ability to undermine the democratic process and play into the hands of foreign adversaries.”
The influencing machine
Russian, Chinese, Iranian and potentially other cyber-specialists are interested in influencing or disrupting the presidential election, national security officials warn.
Intelligence officials reportedly briefed members of Congress that Russia’s preferred outcome this year mirrors that of 2016: a Trump victory. But that country’s operatives also are working to boost Democratic hopeful Bernie Sanders; Sanders has acknowledged receiving a protective briefing from federal officials.
Trump and Sanders both say they want no foreign help winning in 2020. What hostile governments want is less a certain political outcome than “to cause confusion and create doubt in our system,” Secretary of State Mike Pompeo and other officials said on Monday.
Pompeo and the heads of defense and intelligence said they’re responding across the board.
Federal, state and local officials are coordinating in a way they never have before. Big Tech platforms say they’re working within their own networks to clamp down on disinformation and agitation. And political campaigns are trying to make themselves harder targets for cyberattacks than before.
But the enemy also gets a vote, as Pentagon officials like to observe, and officials and observers suggest that influence-mongers are changing their tactics in response to the American countermeasures.
For example, Twitter’s head of site integrity, Yoel Roth, told NPR that his network now traces few, if any, posts directly to Russia, unlike in past cycles. Instead, he believes, influence specialists are doing more to amplify genuinely divisive material posted by real Americans.
Twitter is deploying a new policy on Thursday aimed at flagging what it considers deceptive material. It and Facebook have faced criticism from members of Congress and outside groups for being too sluggish and too conservative about policing disinformation.
Both companies say they’re trying to balance what they call their values and practices with the need to be responsible and help the government.
Both companies regularly report expunging large numbers of fake accounts they say are connected with influence activity; one page that Facebook deactivated last month had more than 50,000 followers, although the company said it wasn’t connected with foreign interference.
Many questions, however, remain.
Are more clandestine activities taking place out of sight? If national security leaders are correct and more nations than Russia are attempting to influence Americans — what novel tactics are now in store? And can a lower but still steady volume of disinformation and agitation still have an effect?
“If you can look like a million humans, what can you do?” asked Tamer Hassan, co-founder and CEO of White Ops, a cybersecurity company focused on understanding and countering bot and inauthentic activity. “The answer is a lot of different things.”
The effects of what Hassan calls “computational influence” can be vast, he said.
Creators of malware establish huge networks of compromised systems that they can use, or lease to others, to influence the apparent popularity of material online.
There are bots that can help with financial fraud, bots that can elevate social media posts and even bots designed to listen to music over streaming services to make it appear more popular, he said.
Election security observers said disclosures like those made by national security officials about continued interference activity are constructive. But the implications of being able to influence and shape perceptions online are profound, and they aren’t going away.
“Awareness is always the first step, but we are a long way from solving the problem and wrapping our arms around it,” Hassan said.
“It’s a difficult problem to solve because we are built psychologically to influence each other. Often, popularity and trends matter. It helps us make decisions. Advertising or political beliefs and political groups like our own democracy are based off of some level of influence and espousing ideas — so it becomes a much more difficult problem.”