Chin, Chapin Testify to NYS Senate on Cybersecurity
In the wake of the Equifax breach, the New York State Senate turned to Syracuse University’s cybersecurity faculty to learn about cyberthreats, best practices and solutions. In the following select passages of their written testimonies, Professors Shiu-Kai Chin and Steve Chapin share their expertise on the topic.
Passages from Professor Shiu-Kai Chin’s testimony to New York State Senate Public Hearing on Cybersecurity:
“If you treat each item of information as if it were a $100 bill, then you will know what to do. Security and integrity must be built in from the initial concept of a system, into its design and throughout its deployment and operation. This is no different than building and operating any business with the financial controls, constraints and policies to assure that every transfer of funds is authenticated and authorized. The same holds true for information. The gold standard is that every transaction must be authenticated and authorized, with assurance that if something was done, it was authenticated and authorized because of the controls and constraints built into the system from the start.
There is no integrity or security without audit. What we are talking about is accountability. Information and information operations must be treated with the same care and diligence as we treat money and financial operations. We need to mimic the routine business practice of annual financial audits to assure the public that public statements of a business’ information operations are accurate and reflect reality.
Math is essential. Financial audits rigorously answer the question whether a business’ balance sheet and policies are accurate statements of its financial state and operations. Evidence is gathered, and numbers are crunched. Compelling proof of integrity requires that everything adds up and is balanced. The same is true for information operations. Math is essential for compelling assurance of security and integrity.
What I am saying is not new. The following passage is part of a paper written by Lieutenant Colonel Roger Schell describing remarks by a KGB officer:
“Comrades, today I will brief you on the most significant breakthrough in intelligence collection since the ‘breaking’ of the ‘unbreakable’ Japanese and German cyphers in World War II—the penetration of the security of American computers. There is virtually (if not literally) no major American national defense secret which is not stored on a computer somewhere. At the same time, there are few (if any) computers in their national defense system which are not accessible, in theory if not yet in fact, to our prying. Better still, we don’t even have to wait for them to send the particular information we want so we can intercept it; we can request and get specific material of interest to us, with virtually no risk to our agents. …
“They are aware of the potential for a computer security problem, but with their usual carelessness they have decided not to correct the problem until they have verified examples of our active exploitation. We, of course, must not let them find these examples.”
The above comments are from Roger Schell’s paper “Computer Security: the Achilles’ heel of the electronic Air Force,” published in Air University Review in 1979! The paper was reprinted in Air & Space Power Journal in 2013 because of its historical significance.
One takeaway is this: cybersecurity is a known problem, and we have known about it for nearly 40 years.
Schell wrote his paper in response to the cancellation of his computer security research program by the Air Force in 1979. Our answers to the question “What did we know and when did we know it?” reveal we have no plausible deniability when it comes to cybersecurity. We knew early on this would be a strategic vulnerability and we chose to ignore it. I deliberately say “we” not “they” because even though “we” were not in command in 1979, “we” collectively set the market and the national expectations of what is reasonable, now.
We can no longer ignore the problem. Business as usual will lead to disaster.
The overarching guidance from Schell’s 1979 paper still applies today:
“Do not trust security to technology unless that technology is demonstrably trustworthy, and the absence of demonstrated compromise is absolutely not a demonstration of security.”
The implication is this: penetration testing, while very useful, is insufficient alone for assuring trustworthiness. We need to do the math much like auditors do the math to provide compelling evidence of trustworthiness. We need to verify that the controls and constraints are appropriate for the intended mission. We must verify the controls and constraints are correctly implemented and used properly.
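To make this point concrete, here is a minimal sketch of what “doing the math” can look like: exhaustively checking an access-control model against a stated constraint, rather than probing a few paths the way a penetration test does. The roles, resources and policy below are hypothetical illustrations, not anything from the testimony.

```python
# A sketch of verifying that controls match the intended mission:
# enumerate every request the model allows and check each one against
# a stated constraint. Unlike a penetration test, nothing is sampled.
# All names here (roles, actions, resources) are invented for the example.
from itertools import product

ROLES = ["clerk", "auditor", "admin"]
ACTIONS = ["read", "write"]
RESOURCES = ["ledger", "audit_log"]

def policy(role, action, resource):
    """The implemented control: True if the request is allowed."""
    if role == "admin":
        return True
    if role == "auditor":
        return action == "read"
    if role == "clerk":
        return resource == "ledger"
    return False

def constraint(role, action, resource, allowed):
    """The intended mission constraint: only auditors and admins
    may ever be granted access to the audit log."""
    if resource == "audit_log" and allowed:
        return role in ("auditor", "admin")
    return True

# Check every case in the model, not just the ones a tester thinks of.
violations = [(r, a, res)
              for r, a, res in product(ROLES, ACTIONS, RESOURCES)
              if not constraint(r, a, res, policy(r, a, res))]
print(violations)  # an empty list is the "books balance" result
```

Real systems have state spaces far too large to enumerate by hand, which is why formal tools exist; but the shape of the argument, every allowed behavior checked against the stated constraint, is the same.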
The good news is we have made a lot of progress since 1979. Mathematical verification of computer systems was once thought to be too hard. Semiconductor companies such as Intel now routinely integrate formal mathematical verification, simulation and testing to verify that their microprocessor chips are correct. Companies such as Rockwell Collins, which manufacture onboard flight control computers for commercial airliners, use formal mathematical proofs to assure that those computers are secure.
Since 2003, Syracuse University, in partnership with the Air Force Research Laboratory in Rome, New York, has offered the ACE (Advanced Course in Engineering) Cybersecurity Boot Camp. ACE has graduated over 500 ROTC cadets, civilians and active-duty personnel from over 50 universities in the U.S. and United Kingdom. ACE provides compelling evidence that rigorous approaches to mission assurance and cybersecurity are feasible and practical at the undergraduate level in engineering and computer science.
As the B.S. degree sets the baseline capabilities for the engineering and computer science profession, it is essential that secure system design and engineering be routine at the B.S. level. I am proud to say that this is the case at Syracuse University.
I end my testimony by pointing out a looming problem we need to address now: the need for trustworthy online and electronic identities. Social Security numbers are fatally compromised. They must be replaced.
The thing about online identity is this: our identity is not who we say we are; it is who others say we are. Who are we going to trust with that authority? How will we know that the foundation for establishing identity is trustworthy? How will authorities trusted with certifying identity be audited to verify their trustworthiness? Authentication technology alone is insufficient. It is just one component in a system that requires policies, practices, norms, rules and regulations.
It is worth pointing out what happened to Roger Schell after 1979. Schell would go on to be regarded as the “father” of the National Security Agency’s Trusted Computer System Evaluation Criteria. It is the foundation of the current National Institute of Standards and Technology security standards. In 2012, Schell was inducted into the inaugural class of the National Cybersecurity Hall of Fame.
Schell is an example of what individuals can do. Our democracy, with all its well-publicized frustrations, is a workable system that enables engaged citizens to keep debate alive; shape the terrain of expectations, standards, policies and practices; and thereby move all of us to a better place.
Start the discussion and debate now on what minimum standards and expectations are required when it comes to establishing, maintaining and verifying the trustworthiness of systems, corporations and government entities entrusted with our safety, information and our identities in cyberspace.”
Passages from Associate Professor Steve J. Chapin’s testimony to New York State Senate Public Hearing on Cybersecurity:
“My invitation to testify requested threat assessment, information on best practices in the face of cyberattacks, and concrete solutions to cybersecurity. I will address each of these points, but let me say in advance: the future is bleak. The path we are on will only see an increase in attacks and losses unless we make significant changes in how we do business (and by do business, I mean both how we conduct commerce and design, build, and operate cyber-systems).
In many ways, the Equifax breach is just the most recent and spectacular in a long string of security failures that put our citizens’ privacy and fortunes at risk. A list of data breaches in 2017 alone includes more than 35 major incidents in industries including finance, internet services, retail, telecommunications, health care and higher education. Last year’s Dyn DDoS attack using the Mirai botnet gave us a glimpse into what we can expect in the future if we continue to deploy insecure and insecurable devices in the Internet of Things. Mirai’s descendant, IoT Troop/Reaper, is estimated to have already infected devices on a million networks. In short, there is no natural upper bound to the damage that cyberattacks can do: all of our information, personal and financial, that sits on commercial, off-the-shelf computers connected to the Internet is at risk.
This threat is not confined to e-commerce, but has already put our elections at risk. In 2003, a panel of experts at the IEEE Security & Privacy Symposium described the state of the art in electronic voting machines. They pointed out multiple flaws with the machines being installed in multiple states. In 2017, experts at DefCon broke into state-of-the-art voting machines in under 90 minutes. Some of their attacks worked over Wi-Fi and were able to change vote tallies without any trace. Other (white-hat) hackers have demonstrated how they can, with only the aid of a USB memory stick, change vote tallies while in the voting booth. In the words of Calvin and Hobbes, “Live and don’t learn. That’s us.”
When Best Practices Aren’t Good Enough
Twenty years ago, Gene Spafford, one of the luminaries of cybersecurity, wrote: “Secure web servers are the equivalent of heavy armored cars. The problem is, they are being used to transfer rolls of coins and checks written in crayon by people on park benches to merchants doing business in cardboard boxes from beneath highway bridges. Further, the roads are subject to random detours, anyone with a screwdriver can control the traffic lights and there are no police.”
Sadly, that is still a largely accurate description of the state of security on the Internet. It doesn’t matter how well-protected the transport is if the computers at the ends of the transaction are not secure. Having a secure connection to a web server doesn’t help if the database that the server stores customer information in is vulnerable. It doesn’t matter how good the security controls on a system are if they’re not turned on and properly configured.
There is a fundamental lack of accountability for cybersecurity. As a private citizen, I would like to share my personal information with the smallest number of entities—but in modern society, I must share with my bank, my credit card company, my utility, my health care professionals and my employer, to name but a few. I have little insight and no control over with whom they share my information. My only choice is to withdraw completely from society, which is a Hobson’s choice. One factor that sets the Equifax breach apart is that most of the people whose data was stolen never directly consented to have Equifax hold it—that was done by the industries that use Equifax’s services to make credit decisions. When breaches happen, there is significant finger-pointing, but in the end, it’s the public that bears the true cost, through identity and financial theft. One of the breaches I referred to earlier involved a contractor leaving 9,000 documents containing personal information on holders of top secret security clearances on an unsecured Amazon server for six months.
We must move away from systems that conflate identification and authentication. There is nothing wrong with using a Social Security Number as an identifier; there is nothing right about using it for authentication. If I chose to have my SSN tattooed on my forehead it should make no difference—it is not a secret, and never truly has been. Treating it as such has given the illusion of security. Similarly, my birthday is a matter of public record. My mother’s maiden name has been in newspapers—newspapers that are now searchable on the Internet.
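The distinction between identification and authentication can be sketched in code: a public identifier merely names an account, while authentication demands proof of possession of a secret, here via a challenge-response. This is an editorial illustration using Python’s standard hmac and secrets modules; the identifier and the protocol details are invented for the example, not a real scheme.

```python
# A sketch (not a real protocol) of separating identification from
# authentication: the identifier is public, like an SSN, while
# authentication requires a secret that is never transmitted.
import hashlib
import hmac
import secrets

# Public identifier: fine to print on a form, or tattoo on a forehead.
user_id = "123-45-6789"  # a made-up example, not a real SSN

# Secret credential held by the user. (A real system would use
# asymmetric keys so the verifier stores no shared secret at all.)
credential = secrets.token_bytes(32)

def prove(challenge: bytes, key: bytes) -> bytes:
    """Answer a challenge by MACing it with the secret credential."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

# The verifier issues a fresh random challenge per login attempt,
# so a recorded response cannot simply be replayed later.
challenge = secrets.token_bytes(16)
response = prove(challenge, credential)

# Knowing the public identifier proves nothing...
forged = prove(challenge, b"I know the SSN: 123-45-6789")
print(hmac.compare_digest(response, forged))   # the guess fails

# ...only possession of the secret credential authenticates.
print(hmac.compare_digest(response, prove(challenge, credential)))
```

The point of the sketch is exactly Chapin’s: nothing breaks if the identifier becomes public, because it was never the secret.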
I have four recommendations to improve cybersecurity in New York State. Some of these are actions business and industry can take unilaterally; others may require regulatory or governmental support.
- Adopt technologies that enable secure auditing and logging of data. Blockchain (the technology behind Bitcoin) provides the potential for secure, distributed logging of actions. This will enable full auditing of information handling and improve accountability.
- Develop a true citizen-focused form of secure identification. Such a form of ID would allow secure authentication without relying on security through obscurity. This is not to be conflated with REAL ID, which does not provide real, nonforgeable, digital authentication and attestation.
- Require security as a first-class element of system design. This security must be end-to-end, holistic and part of the system from day one. No more cardboard boxes and park benches!
- Trust … but verify. We must stop trusting that systems are designed and implemented properly. Rather, system designers and builders should use formal modeling tools (i.e., math and logic) to prove that their systems perform as advertised and correctly implement authentication and authorization.
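The tamper-evident logging idea behind the first recommendation can be sketched with a simple hash chain, the core mechanism a blockchain builds on (a real blockchain adds distribution and consensus across parties). The record format below is a made-up illustration:

```python
# A minimal hash-chained audit log: each entry commits to the hash of
# the previous entry, so altering any past record invalidates every
# hash after it and an auditor can detect the tampering.
import hashlib
import json

def entry_hash(fields: dict) -> str:
    """Hash an entry's fields in a canonical (sorted-key) encoding."""
    return hashlib.sha256(
        json.dumps(fields, sort_keys=True).encode()).hexdigest()

def append(log: list, record: str) -> None:
    """Append a record, chaining it to the previous entry's hash."""
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"record": record, "prev": prev}
    entry["hash"] = entry_hash({"record": record, "prev": prev})
    log.append(entry)

def verify(log: list) -> bool:
    """Recompute the whole chain; any edited entry breaks it."""
    prev = "0" * 64
    for e in log:
        if e["prev"] != prev or e["hash"] != entry_hash(
                {"record": e["record"], "prev": e["prev"]}):
            return False
        prev = e["hash"]
    return True

log = []
for rec in ["alice read file A", "bob wrote file B", "carol read file A"]:
    append(log, rec)
print(verify(log))                      # intact chain verifies: True

log[1]["record"] = "bob read file B"    # quietly rewrite history
print(verify(log))                      # the audit now fails: False
```

This gives tamper evidence, not tamper prevention: an attacker who controls the whole log can rebuild the chain, which is why distributing copies (the “blockchain” part) matters for accountability across organizations.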
I know that it is difficult to define the proper role of government in modern life, particularly in complex technical areas with broad reach. I leave you with another observation from Gene Spafford, recalling that in 1956 GM advertised styling and performance while Ford emphasized the availability of seat belts: “People in general are not interested in paying extra for increased safety. At the beginning, seat belts cost $200 and nobody bought them.” GM outsold Ford by 190,000 cars in 1956, almost three times the gap from 1955. Sometimes we need a nudge.”