Technology and Policymakers

Technologists and policymakers largely inhabit two separate worlds. It’s an old problem, one that the British scientist C.P. Snow identified in a 1959 essay entitled The Two Cultures. He called them the sciences and the humanities, and pointed to the split as a major hindrance to solving the world’s problems. The essay was influential—but 60 years later, nothing has changed.

When Snow was writing, the two cultures theory was largely an interesting societal observation. Today, it’s a crisis. Technology is now deeply intertwined with policy. We’re building complex socio-technical systems at all levels of our society. Software constrains behavior with an efficiency that no law can match. It’s all changing fast; technology is literally creating the world we all live in, and policymakers can’t keep up. Getting it wrong has become increasingly catastrophic. Surviving the future depends on bringing technologists and policymakers together.

Consider artificial intelligence (AI). This technology has the potential to augment human decision-making, eventually replacing notoriously subjective human processes with something fairer, more consistent, faster and more scalable. But it also has the potential to entrench bias and codify inequity, and to act in ways that are unexplainable and undesirable. It can be hacked in new ways, giving attackers, from criminals to nation states, new capabilities to disrupt and harm. How do we avoid the pitfalls of AI while benefiting from its promise? Or, more specifically, where and how should government step in and regulate what is largely a market-driven industry? The answer requires a deep understanding of both the policy tools available to modern society and the technologies of AI.

But AI is just one of many technological areas that need policy oversight. We also need to tackle the increasingly critical cybersecurity vulnerabilities in our infrastructure. We need to understand both the role of social media platforms in disseminating politically divisive content, and what technology can and cannot do to mitigate its harm. We need policy around the rapidly advancing technologies of bioengineering, such as genome editing and synthetic biology, lest advances cause problems for our species and planet. We’re barely keeping up with regulations on food and water safety—let alone energy policy and climate change. Robotics will soon be a common consumer technology, and we are not ready for it at all.

Addressing these issues will require policymakers and technologists to work together from the ground up. We need to create an environment where technologists get involved in public policy – where there is a viable career path for what has come to be called “public-interest technologists.”

The concept isn’t new, even if the phrase is. There are already professionals who straddle the worlds of technology and policy. They come from the social sciences and from computer science. They work in data science, or tech policy, or public-focused computer science. They worked in Bush and Obama’s White House, or in academia and NGOs. The problem is that there are too few of them; they are all exceptions and they are all exceptional. We need to find them, support them, and scale up whatever the process is that creates them.

There are two aspects to creating a scalable career path for public-interest technologists, and you can think of them as the problems of supply and demand. In the long term, supply will almost certainly be the bigger problem. There simply aren’t enough technologists who want to get involved in public policy. This will only become more critical as technology further permeates our society. We can’t begin to calculate the number of them that our society will need in the coming years and decades.

Fixing this supply problem requires changes in educational curricula, from childhood through college and beyond. Science and technology programs need to include mandatory courses in ethics, social science, policy and human-centered design. We need joint degree programs to provide even more integrated curricula. We need ways to involve people from a variety of backgrounds and capabilities. We need to foster opportunities for technologists to do public-interest work on the side, as part of their more traditional jobs, or for a few years mid-career through designed sabbaticals or fellowships. Public service needs to be part of an academic career. We need to create, nurture and compensate people who aren’t entirely technologists or policymakers, but instead an amalgamation of the two. Public-interest technology needs to be a respected career choice, even if it will never pay what a technologist can make at a tech firm.

But while the supply side is the harder problem, the demand side is the more immediate problem. Right now, there aren’t enough places to go for scientists or technologists who want to do public policy work, and the ones that exist tend to be underfunded and in environments where technologists are unappreciated. There aren’t enough positions on legislative staffs, in government agencies, at NGOs or in the press. There aren’t enough teaching positions and fellowships at colleges and universities. There aren’t enough policy-focused technological projects. In short, not enough policymakers realize that they need scientists and technologists—preferably those with some policy training—as part of their teams.

To make effective tech policy, policymakers need to better understand technology. For some reason, ignorance about technology isn’t seen as a deficiency among our elected officials, and this is a problem. It is no longer okay to not understand how the internet, machine learning—or any other core technologies—work.

This doesn’t mean policymakers need to become tech experts. We have long expected our elected officials to regulate highly specialized areas of which they have little understanding. It’s been manageable because those elected officials have people on their staff who do understand those areas, or because they trust other elected officials who do. Policymakers need to realize that they need technologists on their policy teams, and to accept well-established scientific findings as fact. It is also no longer okay to discount technological expertise merely because it contradicts your political biases.

The evolution of public health policy serves as an instructive model. Health policy is a field that includes both policy experts who know a lot about the science and keep abreast of health research, and biologists and medical researchers who work closely with policymakers. Health policy is often a specialization at policy schools. We live in a world where the importance of vaccines is widely accepted and well-understood by policymakers, and is written into policy. Our policies on global pandemics are informed by medical experts. This serves society well, but it wasn’t always this way. Health policy was not always part of public policy. People lived through a lot of terrible health crises before policymakers figured out how to actually talk and listen to medical experts. Today we are facing a similar situation with technology.

Another parallel is public-interest law. Lawyers work in all parts of government and in many non-governmental organizations, crafting policy or just lawyering in the public interest. Every attorney at a major law firm is expected to devote some time to public-interest cases; it’s considered part of a well-rounded career. No law firm looks askance at an attorney who takes two years out of his career to work in a public-interest capacity. A tech career needs to look more like that.

In his book Future Politics, Jamie Susskind writes: “Politics in the twentieth century was dominated by a central question: how much of our collective life should be determined by the state, and what should be left to the market and civil society? For the generation now approaching political maturity, the debate will be different: to what extent should our lives be directed and controlled by powerful digital systems—and on what terms?”

I teach cybersecurity policy at the Harvard Kennedy School of Government. Because that question is fundamentally one of economics—and because my institution is a product of both the 20th century and that question—its faculty is largely staffed by economists. But because today’s question is a different one, the institution is now hiring policy-focused technologists like me.

If we’re honest with ourselves, it was never okay for technology to be separate from policy. But today, amid what we’re starting to call the Fourth Industrial Revolution, the separation is much more dangerous. We need policymakers to recognize this danger, and to welcome a new generation of technologists from every persuasion to help solve the socio-technical policy problems of the 21st century. We need to create ways to speak tech to power—and power needs to open the door and let technologists in.

This essay previously appeared on the World Economic Forum blog.

Posted on November 14, 2019 at 7:04 AM • 32 Comments

Comments

Bertol November 14, 2019 9:10 AM

“Technologists and policymakers”


well, “policymakers” is a euphemism for ‘politicians’ (government rule-makers).

thus, the thrust of the argument here is that we need many more tech-savvy politicians (and technocrats) to firmly guide society in the “proper” direction via government “policy”.

this is a heavy political-ideology point of view, rather than mere technology advocacy.

Birch November 14, 2019 9:37 AM

@Steve

Looks like someone’s discovered the New Age Bullshit Generator at

http://sebpearce.com/bullshit/

Tragedy is that many policymakers** would absorb it without question.

** Not just in the US but also in many [most?] countries. Sadly, we too have more than our fair share of well-connected idiots in positions of power in the UK. In many cases technology and science are looked down upon whilst studying ancient history and dead languages are lauded.

Robbie Malthus November 14, 2019 10:31 AM

“We’re barely keeping up with regulations on food and water safety — let alone energy policy and climate change”

And this will be our doom. Technology policy will not matter in a couple of decades because the human race will be dealing with the aftermath of nuclear war and the threat of near term extinction.

Countries compete for vital resources, like they always have. Water and land that can be used to grow food will become strategic assets that easily outstrip oil.

There are whole sections of the planet that will be turned into desert hellscapes. Billions of people will be trapped and nations will tear each other apart in the battle for survival. War is the recurring theme of human history.

But, hey, if you’re retired and you want to dabble in public policy. What’s the harm?

AlanS November 14, 2019 11:18 AM

@Bruce

The phrase “socio-technical systems” comes up again. Where’s this coming from?

There are a whole series of inter-related research programs on the sociology and history of technology that have been running for decades. One of the best known volumes in this field is MacKenzie and Wajcman’s The Social Shaping of Technology, published in 1985 (see also Bijker et al. and Bijker and Law). MacKenzie was part of a research unit at the University of Edinburgh that was founded by an astronomer, David Edge, in the early 1960s, in part as a response to C.P. Snow’s Two Cultures. The original research unit was in the Science Faculty and provided courses to science students on the history, politics, and sociology of science and technology.

A primary objective of this field is to provide more sophisticated accounts of the development of technologies that avoid technological determinism, one might say provide descriptions of the complexities of socio-technical systems and enable more sophisticated and critical policy discussions. Given this background what I find surprising is that you use a term like “complex socio-technical systems” but then immediately revert to technological determinism:

Software constrains behavior with an efficiency that no law can match. It’s all changing fast; technology is literally creating the world we all live in, and policymakers can’t keep up.

If you read your text it’s as if these technologies appear all by themselves or inevitably and then we have to wrap policy around them to control them. If you weren’t yourself a victim of the two cultures problem I think you might have grasped that there are no technical systems outside social relations and politics. They are both shaped and shaping from their genesis.

You write:

…to what extent should our lives be directed and controlled by powerful digital systems — and on what terms? I teach cybersecurity policy at the Harvard Kennedy School of Government. Because that question is fundamentally one of economics — and because my institution is a product of both the 20th century and that question — its faculty is largely staffed by economists

God save us from economists. Economics used to understand that the market was a social phenomenon hence the old name for the discipline, political economy. Modern economists treat the market as a mathematical phenomenon and claim they are the only social science that is a real science. They are a big part of the problem; not the solution.

Many people writing about technology referenced above have also written about science policy and public accountability (e.g. see Callon et al. or publications by Brian Wynne). There is also an extensive literature on the construction of expertise and policy discussions, including how public debate is shaped and some perspectives get to be heard while others are discredited. Clamoring for more experts informing policy begs a lot of questions. It’s social and political all the way down and all the way up.

If you don’t already know her, you may want to look up Sheila Jasanoff, who knows all these people and worked with many of them in the STS program at Cornell and is now at the Kennedy. She’s done extensive work on technology policy.

Peter S. Shenkin November 14, 2019 11:20 AM

“If we’re honest with ourselves, it was never okay for technology to be separate from policy”

Just to say, I disagree entirely.

  1. Nobody can predict the social effects of a new technology in detail, and the net effect of conflating the two will, at best, be implementing policy to avoid future problems that would never have occurred. It would also stifle benign developments, which would then be made never to occur, and drive dissenting development underground. We will inevitably miss bad implications that pop up later. To paraphrase Montaigne, “Life is full of disasters, most of which never occur.”
  2. Policy has competing narratives. Just look at the current agenda of the left and right ideologues here in The Great Republic. At any given moment, the dominant ideology will at least attempt to implement its program through the conflation of technology and policy. I personally am not thrilled with either end of the ideological spectrum, for what it’s worth.
  3. It is not possible to be an honest technologist if you are always concerned with policy, any more than it is possible to be an honest scientist if you are always worried about the social effects of your discoveries. Because then your political views will prejudice the reading of your data. Purely scientific confirmation bias is already a problem. Mix in policy and it’s poisonous.
  4. Because of (3), it seems to me that actively working technologists who care most about policy will be those whose views are dominated by one or another political extreme. Those who are not would be more likely to shrug their shoulders, view the extremes with scepticism, if they think about policy at all, and wait to see what happens.

Example: In your own work, you have pioneered strong cryptographic schemes. Did you think at the start that such schemes would end up protecting the sharing of the most horrifying child-porn images and live videos imaginable? Perhaps you did realize that, or perhaps you assumed that these would be a tiny little activity around the fringes. Would you have done anything differently had you fully appreciated the ubiquity of this phenomenon at the start? How about other technologists? What, if anything, would have been different had all technologists been fully aware of this from the start?

There are strong and qualified voices in the computer security community who use this example to advocate a legal requirement for back doors in cryptographic protocols. You disagree (and frankly so do I), but you care about policy and the dissenting voices also care about policy. Both sides have eloquent advocates. It seems to me that if you are a technologist who cares deeply about policy, you might (if you weigh the child-porn implications highly) refuse to work on strong encryption, or dedicate yourself breaking the protocols and promulgating hacks to do so. And of course these hacks would be usable as well for less socially enlightened causes. But this is the situation we are in now, anyway.

How would your proposals affect situations like this? My view is that they wouldn’t, because (a) the future social implications of a new technology are never obvious in advance, and (b) even once bad s**t happens, reasonable arguments can be made that eliminating the bad s**t will create worse s**t.

mrfox November 14, 2019 12:01 PM

For some reason, ignorance about ____ isn’t seen as a deficiency among our elected officials,

[insert: technology, science, reality, …]

It is worse than that: It’s seen as a feature, not a bug.

Hank November 14, 2019 12:20 PM

“power needs to open the door and let technologists in”

I think that technology now has a grip on powerful psychology and propaganda tools, and my (admittedly cynical) view of power/politics is that their typical shortsightedness leads them to embrace what they understand as a tool to grasp more power without much understanding of its pitfalls. One might compare this to the atomic age and take some comfort in the fact that we have avoided nuclear disaster so far. My own opinion is that the dangers now, from AI and pervasive big data, are already out of control.

I take heart that a tireless crusader such as yourself is at Harvard.

gggeek November 14, 2019 12:21 PM

@Peter S. Shenkin: I think you are missing, or at least undermining, an obvious 3rd option in the choice between no-crypto-because-porn and yes-crypto-because-i-value-tech-above-society and it is: yes-crypto-because-individual-freedom-is-worth-much-more-to-society-than-harm-done-by-child-porn.

To me your commentary, as well as a couple above, are a good example of what the OP is hinting at: they clearly show how tech-heads do need more humanities to achieve a better understanding of the context into which tech comes to fruition and the overall social dynamics.

As for new tech advancing too quickly for regulators to try to coerce it into safe forms, the opposite point can be made as well: exactly because new tech is more powerful and grows faster than ever we should apply precaution principles to it.

How would you sleep at night having worked with Oppenheimer on the Manhattan project on the basis of “I am only advancing science”?

Alan G Yoder November 14, 2019 3:40 PM

“Public-interest technology needs to be a respected career choice, even if it will never pay what a technologist can make at a tech firm”.

Really?

All of your questions about this issue have just been answered in this one sentence.

Clive Robinson November 14, 2019 4:38 PM

@ Bruce,

We also need to tackle the increasingly critical cybersecurity vulnerabilities in our infrastructure.

You and I might, as no doubt a great number of this blogs readers.

But neither politicians nor those big Silicon Valley Corps want it, nor do those who allegedly look after our safety, that is the internal and external facing guard labour of the Law Enforcement and Intelligence and Military communities.

With them is the money and the power, thus the policy they want.

Virtually every technological advance this century has been corrupted, perverted or in other ways forced into the service of such people.

How you reverse this trend I’ve no real idea because “regulatory capture” is as close to 100% as makes no real difference.

And opting out is not possible, various countries have now made the use of computers to deal with government entities mandatory. This is slowly but surely getting rolled out. Part of this is it reduces “man power”, especially those that have a conscience. Thus natural organisational pressure to behave in a moral or even lawful manner is being phased out; only those that have no morals or ethics survive to rise, and their ethos is to care only for their own interests…

We talk about “Russia interfering in US and UK elections” but that is really an irrelevance. They could only do it because technology that the US thought up, designed, built, and put in place made it easy to do so.

The question that those supposedly in charge are not asking is,

    Why are they this way?

Because too many vested interests in power and money, do not want that question asked, because an honest answer is not one society will accept.

As a technologist your primary task is to get to the bottom of such things and as you have taken the “public interest” brief you will find that you will come into conflict with those vested interests. As has been noted,

    “The Internet, our greatest tool for emancipation, has been transformed into the most dangerous facilitator of totalitarianism we have ever seen”

The people that have orchestrated this, do not like attention being drawn to it.

Thus how you choose to act will define your future, but it’s not hard to find three or four examples of people that acted in the Public Interest and are thus examples of what can happen.

1, Aaron H. Swartz,
2, Chelsea E. Manning,
3, Edward J. Snowden,
4, Julian P. Assange.

But there are others that went before them, who were driven into bankruptcy as a ploy to silence them via malicious and unwarranted prosecution, and their past employers did everything they could to destroy their good names, and any future prospects they might have.

It is against this backdrop that people will have to come forward to be “Public Interest Technologists” and as you note for little or no reward or future employment prospects.

You can further add that the Big Silicon Valley Corps have been found guilty of discriminatory practices against those that wish to simply exercise their rights to work for whom they choose at fair market rates.

How do you think the Corps will treat those they see as frustrating their desires for power and wealth?

To say you will have an uphill struggle would be an understatement, like comparing the throw of a ball to that of a moon shot.

That is not to say people should not become Public Interest Technologists, they should. But if they don’t take an honest appraisal of what is before them, then they are going to suffer disappointment at the very least.

Anon November 14, 2019 4:41 PM

The idea of centralized “policy” is going the way of the Dodo bird – precisely because technology and its relentless evolution will spell its doom.

There is no sense in trying to “make it more effective” – it’s a waste of time. And because it makes no sense and because it will be a giant waste of time – it is precisely what will happen.

OP wants more positions for technocrats inside cozy offices in the DC area? He’ll get that and more. There will be more advisors, more meetings, more red tape until every f*ng iteration of Facebook and every damn crypto will be detailed in a government “study”, put under a microscope, beaten into the ground with this “expert panel” and that “efficiency subcommittee”.

It’s OK though – the centralized nanny state and its functionaries are becoming more and more fragile – like every overgrown, self-serving, rigid, senescent organism just before its fast-approaching agony and inevitable demise. Society is robust though – especially as it reaches a new equilibrium point between globalism and decentralized micro-states…

Tech typhoon, unleash your fury, tech tsunami — strike yet harder!

AlanS November 14, 2019 4:58 PM

@gggeek

To me your [Shenkin] commentary, as well as a couple above, are a good example of what the OP is hinting at: they clearly show how tech-heads do need more humanities to achieve a better understanding of the context into which tech comes to fruition and the overall social dynamics.

Agreed. At least the conversations about technology might be improved.

@Shenkin

How would your proposals affect situations like this? My view is that they wouldn’t, because (a) the future social implications of a new technology are never obvious in advance, and (b) even once bad s**t happens, reasonable arguments can be made that eliminating the bad s**t will create worse s**t.

(a) Yes but technology doesn’t merely have social implications. It’s inherently social from the start. It doesn’t appear and develop in a vacuum.

(b) Sure there can be more unforeseeable consequences (almost guaranteed) but again none of this takes place in a vacuum. Treating technology as if it has no social or cultural context is inherently political because it stifles debate and democratic accountability. And in fact a lot of policy around technology has taken this form by restricting what is up for debate and whose expertise gets to count. Value choices are hidden behind technical inevitability. It’s not about whether we can control it (a sort of social determinism which we should also reject) but who gets to shape it and to what ends.

John Smith November 14, 2019 5:32 PM

Max Planck:

“A new scientific truth does not generally triumph by persuading its opponents and getting them to admit their errors, but rather by its opponents gradually dying out and giving way to a new generation that is raised on it.”

So, like Science, like most human endeavors, public policy will advance one funeral at a time. Intermittent crises will speed up the process a little.

Peter S. Shenkin November 14, 2019 6:26 PM

@gggeek “@Peter S. Shenkin: I think you are missing, or at least undermining, an obvious 3rd option in the choice between no-crypto-because-porn and yes-crypto-because-i-value-tech-above-society and it is: yes-crypto-because-individual-freedom-is-worth-much-more-to-society-than-harm-done-by-child-porn.”

I referred precisely to that in my closing sentence. I don’t think it a 3rd option. It is the argument that I think anyone in favor of strong encryption would make when confronted with the way strong encryption assists the distribution of child pornography. And I stated my agreement with that argument.

@gggeek “How would you sleep at night having worked with Oppenheimer on the Manhattan project on the basis of “I am only advancing science”?

Well, how would you sleep at night having worked with Oppenheimer on the Manhattan project on the basis of knowing, as everybody in fact knew, that what you were doing was creating the most powerful weapon in human history, whose use would no doubt result in the deaths of a large number of people? Those few whom I’ve met who actually did work on the Manhattan project slept very, very well, believing (correctly in my opinion) that they had done a great thing for humanity. As my graduate research advisor, who had been there, put it, “I believed Hitler had to be stopped.”

Of course, that technology ended up having a large number of social effects that have been unpleasant, but in fact the development of nuclear weapons was a policy decision on the part of the U. S. government. It was not a case of technology evolving separately from policy.

@alanS “[Technology is] inherently social from the start. It doesn’t appear and develop in a vacuum.”

Yes, absolutely.

@alanS “Value choices are hidden behind technical inevitability.”

I have never heard anyone make an argument for “technical inevitability”. Have you? If so, please cite.

The closest I can come is the fear, among those working on the Manhattan Project, that we had better hurry up and get this done, because Hitler might get there first. But that is not an argument based on technical inevitability. It is an argument based on political inevitability.

@alanS “Treating technology as if it has no social or cultural context is inherently political because it stifles debate and democratic accountability.”

I don’t think technology stifles debate. I hear debate all the time about what Facebook and Google and Amazon should or should not do, and for that matter the effect of strong encryption on the proliferation of child porn.

But recall that I was objecting to Bruce’s assertion that

“it was never okay for technology to be separate from policy”

Ponder that for a couple of minutes, if you will, bearing in mind that we have already agreed that it is already not free of social or cultural context.

Then ask, who has, or should have had the role of making sure that technology is not separate from policy? Who makes the policy that technology should not be permitted to be “separate” from?

It would seem that the only way to ensure that technology is not “separate from policy” is to require that the development of technology conform to some policy. I really don’t see a way around that. Such conformance would stifle debate far more effectively than “technology” does.

I do agree with Bruce that the world would benefit if technologists would give thought to the long-range effects their work might have. They’ll be wrong most of the time, but as Eisenhower said, “Plans are useless, but planning is essential.”

And I agree that it can be shocking how little policy makers understand technology, and that the world would be better off if they were better educated in this area. And let us not forget that policy makers are no better than technologists at predicting the long-range effect of their labors. Senator Smoot and Representative Hawley certainly didn’t expect their tariff to make the great depression worse, but it is the current consensus among economists that it did.

WOW FBI CIA swapped back in the day oh yeah we forgot how many times has that occurred November 14, 2019 7:49 PM

From the essay, I’m catching an ethics vibe, and that’s good!

I still think A.I.’s (not A1 steak sauce!) need to be treated with kindness and fairness. However, even the typical sentient porcine mammal is subjected to hideous genocidal murder worse than WWII era AZ woes/foes.

So both really deserve better.

Thanks for keeping the conversations going.
It would be much nicer to live in a society not dominated by destructive appetites.

None of these domains are limited nor described fully by computer science terms.

Consider this a leap to the part that matters: Can we implement the Internet Off Switch already please?

I’d like to go back to the early 1990’s.
Consider that truthful humour.

Impossibly Stupid November 15, 2019 12:01 AM

In the long term, supply will almost certainly be the bigger problem. There simply aren’t enough technologists who want to get involved in public policy.

This just isn’t true. Almost everywhere I’ve been as a consultant I couldn’t throw a rock without hitting several geeks who were tired of being at yet another “startup culture” tech company that was managed by amoral jerks whose only business plan was the baseless fantasy that they could build a cheap product just well enough to convince some giant company like Google to buy them out for billions. That “supply” is large and eager to do more meaningful work. The real problem remains, as I’ve pointed out many times before, that the clueless HR drones in public institutions (and most other private companies) fail to actually engage those people in a manner that demonstrates a sincere interest in acquiring talent to do innovative public development.

    Right now, there aren’t enough places to go for scientists or technologists who want to do public policy work, and the ones that exist tend to be underfunded and in environments where technologists are unappreciated.

See, here you even acknowledge the fact that there isn’t enough demand for the current supply. All the nonsense you wrote about increasing the supply would only make matters worse.

    Policymakers need to realize that they need technologists on their policy teams, and to accept well-established scientific findings as fact. It is also no longer okay to discount technological expertise merely because it contradicts your political biases.

I would argue that those with counterfactual political biases need science and technology experts more. Not necessarily to correct their thinking, either, but to at least make a case that’s more sensical than the ravings of a loon! I mean, to properly twist logic, you really need a strong grasp of the subject, right? Some will say that’s a misuse of science, but the truth is that it is a tool like any other, and just as you would grant a criminal a lawyer to defend themselves against a prosecution attorney, it would be smart to get someone to propose the most rational hypothesis for the most wackadoodle political policy as well.

    If we’re honest with ourselves, it was never okay for technology to be separate from policy.

That ignores history. The honest truth is that all major governments in today’s world are still based on pre-scientific thinking. Consequently, it’s easy to understand why the related advances which technology provides are also pushed away by those who wish to continue to engage in outdated magical thinking. Until political policy changes to represent even 19th century levels of rational thought, there is very little reason to believe that society will continue to function well as we move deeper into the 21st century.

Winter November 15, 2019 2:09 AM

@Birch
“whilst studying ancient history and dead languages are lauded.”

Quote: Those who do not learn history are doomed to repeat it.
Attributed to George Santayana

If there is anything that hampers current day politics it is a rejection of sound historical understanding. Every nation is culpable, and in every nation there are people that want to go back to the 19th century and earlier, and to reject the French Revolution.

So I disagree: We should have MORE people that can tell us how people in the past succeeded and failed, not fewer. Just as we need MORE people who understand mathematics (a backdoor for one is a backdoor for everyone), statistics (you cannot win in a lottery, unless you own the lottery), biology (vaccines are natural defenses), physics, and sociology (too much inequality destroys a community).

M4l4ndr4g3m November 15, 2019 4:13 AM

Any suggestions on how to follow in your footsteps? How can others interested in cybersecurity and policymaking get involved?

Gerhard Groote November 15, 2019 7:00 AM

Although very reluctant to criticize anything by Mr. Schneier, I would like to say with the utmost of due respect that Technology and Policymakers is a mystifying essay that misses the point.

What in the world is Mr. Schneier talking about? Whence comes the series of value judgments that produced such strong words: increasingly catastrophic, surviving the future, this danger, etc.? Catastrophic to whom? Surviving the future? A bit hyperbolic, one hopes. What danger?

If I make a pile of money, how does anyone have the right to tell me that this is catastrophic or dangerous?

I did notice the words bias and inequality. What kinds of bias and inequality are we talking about here?

Winning the athletic contest of the world is not about equality, and the new athlete on the block is China. In that country, power already speaks tech, policy and tech are already matched very nicely indeed, they are already ahead in AI, and they are in it to win it.

The effort to create equality at all costs where there is none will not work. Like it or not, the future probably belongs to racially integrated, centralized states that inculcate a strong work ethic, have unbiased and intense competition in schools that focus on maths, science, and technology, and whose people have a deep sense of trust in each other and in their government– so deep they do not need to talk about it– and a common destiny.

Petre Peter November 15, 2019 7:56 AM

Technology shapes policy and policy shapes technology. This is why it’s impossible to regulate technology that is not understood, and to create concepts without laws. Laws should be written after we understand the technology. Writing them before is an admission that we have not understood the technology. While that might be ok at some level of concepts through the word instructions, it’s not ok for a finished product. The engineer gives up at the word instruction, the politician gives up at the word election, and the statesman gives up at the word generation. When we cross the streams between technology and policy, the word next can no longer be used as an excuse for unfinished products and we will no longer have to buy services through plans.

Slimmer Bezels November 15, 2019 1:49 PM

@ Gerhard Groote

    their government– so deep

I.e., welcome to hell – North Korea plus economic viability

Impossibly Stupid November 15, 2019 5:56 PM

@AlanS

    Modern economists treat the market as a mathematical phenomenon and claim they are the only social science that is a real science.

Worse, many are still taken seriously when they posit mathematical models that are objectively contrary to reality, like humans being perfectly rational actors or markets that can sustain unlimited growth. They’re about as scientific in their approach as astrology.

@John Smith

    So, like Science, like most human endeavors, public policy will advance one funeral at a time.

Sadly, no. The problem with laws is that they usually go on independent of the lives of the policy makers and without regard to the current will of the people. That’s why you’ll see a news story every once in a while about some dumb rule that’s still on the books from like 1837 (e.g., a gentleman must tip his hat to a lady if she rides past him on horseback, and additionally bow if she is in a carriage). It’s a rare thing for the rules you have to obey to number fewer, get more concise, or become less complex.

@Gerhard Groote

    If I make a pile of money, how does anyone have the right to tell me that this is catastrophic or dangerous?

They earn that right by doing a scientific analysis of the consequences. Piles of money don’t just appear from nowhere, nor do they exist without affecting society. Yours is an argument from ignorance. If you don’t see that, well, then it could be argued that you won’t miss that pile if someone takes it from you. I mean, by your logic, what right do you have to tell someone they won’t do more worthwhile things with your money?

@Petre Peter

    Laws should be written after we understand the technology. Writing them before, is an admission that we have not understood the technology.

No, that’s unscientific thinking (try substituting “will of God” for “technology” if you don’t see it). It is necessary to create a testable hypothesis before we can subsequently declare something to be a reasonable “law”. This would be true of political policies based on scientific processes: they must be created with an intent and then shown to be fit for the outcome/change that was proposed. If they are not, or alternative policies demonstrate better results, only then can we say the incorrect/inferior law must go away.

Gerhard Groote November 16, 2019 3:29 AM

The canary in the coal mine died when China started earning more patents in artificial intelligence than the U.S.

RgTn November 16, 2019 11:12 AM

government (‘policy’) has never historically been the engine of sound technological progress in society… coercion is the only unique factor it adds to the flow of human interaction.
At best, government ‘policymakers’ merely buy arbitrary technology from the private sector with tax money extracted from the private sectors.

The pursuit of technology ALWAYS requires significant, subjective value judgements. But whose ‘values’ are best (?) … certainly not government bureaucrats spending huge amounts of other folks savings.

The erroneous notion that rational technological progress critically depends upon government sits rather left on the U.S. political spectrum.

Clive Robinson November 16, 2019 1:28 PM

@ RgTn,

    … government ‘policymakers’ merely buy arbitrary technology from the private sector with tax money extracted from the private sectors.

Err not exactly accurate for what is going on…

It should be “extort” not “merely buy” after “policymakers”. Likewise it should be “non tax paying corporate” before “private sector” with the second “private sector” replaced with “tax paying citizens”.

So,

    government ‘policymakers’ extort arbitrary technology from the non tax paying corporate private sector with tax money extracted from the tax paying citizens.

Both more accurate, and it shows the money path in a way that is easier to see. It also shows that the tax paying citizens are getting hit three times,

1, The taxes they cannot avoid.
2, The surveillance they cannot avoid.
3, Enriching both the abusing corporations and those legislators that the corporate lobbyists grease with the crumbs off of the taxes taken.

That’s what the “Great American Dream” really is these days, not that it’s much better in other western first world nations and quite a few “non representational democracies”. Places where the US Gov scream they are “corrupt” and “non democratic”… Thus bringing new meaning to “ironic” and “hypocritical”, no matter where you stand…

Jeremy November 18, 2019 7:56 AM

Wikipedia’s quotation from Snow’s essay is worth sharing:

    A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: ‘Have you read a work of Shakespeare’s?’

    I now believe that if I had asked an even simpler question – such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, ‘Can you read?’ – not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their Neolithic ancestors would have had.

Clive Robinson November 18, 2019 5:44 PM

@ Jeremy,

Two salient points from the quotation,

    not more than one in ten of the highly educated would have felt that I was speaking the same language
    and the majority of the cleverest people in the western world have about as much insight into it as their Neolithic ancestors would have had.

I have observed many times before that if technically minded people want to get ahead, getting an MBA under their belt will help them communicate effectively with the man that cuts their cheques every month. They won’t learn “techno-speak”, so you have to learn “C-suite-speak”; otherwise you might as well be speaking Chinglish to them in a broad north Glasgow accent thickened by a pint of whisky.

I’m by no means the first to make a similar observation; in fact the author George Orwell made the issue a plot point in “1984”. The “newspeak dictionaries” came in three entirely different forms so that communication between groups was not possible except through an “interpreter” who worked for the ruling elite, and thus acted as a censor and controller of information flow.

On to the second point from CP Snow’s commentary. I think he was being “nice” rather than honest. It’s not that difficult to realise that our Neolithic ancestors would have had rather more insight into the practical realities of physics than even the average person today. Why? Because they had to understand the dynamics of throwing rocks, spears and other projectiles to get the daily food by which they survived. They would likewise have had a very real appreciation of just how far they could jump down safely, because getting it even slightly wrong meant lameness and slow starvation, or a rather painful death, be it from shock, blood loss or infection.

The reality of the Neolithic lifestyle was “follow the herd”, or equivalent, for most of the year, so you either kept up and had sufficient reserve capacity to hunt or you died. Death also came from “worthwhile prey”; hunting is not a zero sum game. Small game takes more energy than you get from eating it, while large game can inflict devastating injury, so getting close enough to kill efficiently and safely was a very finely balanced skill. It was not simply a choice of “live or die” but how you died.

Some realised that “trapping” was beneficial in that it tipped the energy equation against the prey: trapping rat sized or larger rodents could keep you alive almost indefinitely, and since trapped prey could be taken and kept alive, it could give rise to a stable food source. This in turn led to animal husbandry. The change from “hunter gatherer” to “hunter farmer” and fixed settlements that allowed other more rapid innovation started happening around ten thousand years ago and is known as the “Neolithic Revolution”.

Back then each person had to be multiskilled, only specializing as settlements built a support network and trade became possible. In a way, whilst this sped innovation up it also acted as an inhibitor, as the more valuable a skill the more important it was to keep its elements secret. It was not until much, much later, with “Renaissance Man”, that artisan skills started giving way to nascent science, as ideas from one domain got transferred to other domains and practical skill knowledge became almost abstract rules for all domains.

It was something we lost again last century, and the result was a form of stagnation. In this century the easy sharing of knowledge among 80% or so of the world’s population has again started a general trend to cross domain working, which has resulted in actual innovation in ideas and technology, to the point that many are thinking things are changing too fast and that everyone is now in a “Red Queen’s race”. In many respects that is true, and the negative effects are showing up in society, especially in mental health care requirements. Worse, knowledge of our living past and the lessons learned is more rapidly being either forgotten or not taught, in preference to some rather pointless new paradigm and supporting methods.

This is most clearly seen in the likes of IT, where there is a gulf growing up the computing stack from the CPU ISA gap. Whilst it might not be seen as important to those at higher levels in the stack, it is a very genuine problem for security. The reality is that at whatever level you work in the stack, your ability to influence security, whilst reaching up the stack, does not really reach down much more than one level below the one you are working at. Thus without certain measures in place you open up the security gap beneath any security control you have.

Thus as an example, if I can manipulate memory contents whilst they are out of the CPU focus, I can affect the execution of a program running on a CPU at the ISA level or above without the program being able to tell. Which means that any security attributes that program has can be subverted entirely.

Whilst I’ve mentioned this often enough in the past, it was RowHammer and later Meltdown and Spectre that finally made people realise that, in effect, they had no seat in their pants, and a chill and ill wind was blowing their way.
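The point about reaching down the stack can be sketched as a toy simulation (assuming nothing about real DRAM physics; the names `program_checks_privilege` and `rowhammer_style_flip` are illustrative, not from any real tool): a program’s security check is perfectly correct at its own level, yet a single bit flipped by a layer beneath it silently changes the outcome.

```python
# Toy model of a security control being subverted from below the stack.
# The "program" only ever reads memory through its own correct logic;
# the "attacker" operates one level down, directly on the bytes.

memory = bytearray(1)   # one byte of simulated RAM holding a privilege flag
ADMIN_FLAG = 0          # address of the flag; zero means unprivileged

def program_checks_privilege(mem: bytearray) -> bool:
    """The program's security control: correct at the level it works at."""
    return mem[ADMIN_FLAG] != 0

def rowhammer_style_flip(mem: bytearray, addr: int, bit: int) -> None:
    """A lower-layer fault or attack flips one bit, unseen by the program."""
    mem[addr] ^= (1 << bit)

assert program_checks_privilege(memory) is False   # access correctly denied

rowhammer_style_flip(memory, ADMIN_FLAG, 0)        # one bit, from below

print(program_checks_privilege(memory))            # prints True: check subverted
```

Nothing in `program_checks_privilege` is buggy, which is the crux of the comment: a control at one level of the stack cannot defend against manipulation a level beneath it.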

From my point of view I’ve worked out how to mitigate these problems: in effect by building ships, not castles. It’s something others are waking up to as an idea, and even academia is now starting to stir, albeit slowly, in that direction (as some who we know have read this website have come out with inferior ideas).

Gerhard Groote November 21, 2019 10:58 PM

“Science and technology programs need to include mandatory courses in ethics, social science, policy and human-centered design.”

–If such courses were created and taken seriously, there would be no Facebook, Twitter, or Google.

“Politics in the twentieth century was dominated by a central question: how much of our collective life should be determined by the state, and what should be left to the market and civil society? …”

–This is a caricature.

Speaking tech to power, to the people who are developing technology and swimming in cash, is purely quixotic. Telling people how technology should be used may sound beneficial, but it’s not. It will not work because the people who are going to realize the real potential of science and information technology in the coming decades are already one step ahead–and they do not care what you think. China is going to move ahead, and they fully have the right to do so. They are going to harness the synergy.

In the West, self-interest and corporate greed are masquerading under the names of “diversity” and “equality” as they do everything to lower costs and get cheaper labor. They do not mind managed decline. They are very nicely connected to legislative authorities. These are their halcyon days.

Moreover, they are now in the business of censorship.

Being broken up is the one thing they do not want to see, and this is what we should focus on.
