CS491cm2
William Lee

Establishing the Genuinity of Remote Computer Systems by Rick Kennell & Leah H. Jamieson

Summary:

Kennell and Jamieson propose a method to verify the genuinity of computer systems without human intervention. For example, in a networking environment where one needs to authenticate many hosts, one may not want to visit each system and verify its authenticity in person. Rather, they suggest that the genuinity of a system can be verified by checking both the targeted system's software and its hardware.

The authors verify the system software by performing a checksum routine over the OS kernel's memory space. They assume the hardware architecture can be verified by checking the TLB hit/miss ratio. Combining the two, they compute the checksum of the software by summing memory contents together with the TLB miss count during a pseudorandom memory traversal.
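The core idea can be illustrated with a toy sketch: a pseudorandom walk over memory that folds both the bytes read and the running TLB miss count into one checksum. All names, sizes, and the mixing function here are illustrative assumptions, not the paper's actual routine, and the TLB is simulated in software rather than read from hardware counters.

```python
import random

PAGE_SIZE = 4096     # bytes per page (assumed)
TLB_ENTRIES = 64     # simplified fully-associative TLB (assumed size)

def genuinity_checksum(memory, num_reads, seed):
    """Toy sketch: checksum a pseudorandom walk over memory, mixing in
    the TLB miss count so a simulator must reproduce the exact TLB
    behavior, not just the memory contents."""
    rng = random.Random(seed)   # challenger-chosen challenge seed
    tlb = []                    # recently used page numbers (LRU order)
    checksum, tlb_misses = 0, 0
    for _ in range(num_reads):
        addr = rng.randrange(len(memory))
        page = addr // PAGE_SIZE
        if page in tlb:
            tlb.remove(page)    # TLB hit: refresh LRU position
        else:
            tlb_misses += 1     # TLB miss: folded into the checksum below
            if len(tlb) >= TLB_ENTRIES:
                tlb.pop(0)      # evict least-recently-used entry
        tlb.append(page)
        # Mix the byte read and the current miss count into the checksum.
        checksum = (checksum + memory[addr] + tlb_misses) & 0xFFFFFFFF
    return checksum, tlb_misses
```

Because the traversal order depends on a challenger-chosen seed, the expected checksum changes with every challenge, which is what makes precomputation by an attacker impractical.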

Many different potential attacks are identified. The most apparent is the simulator attack: an attacker simulates the machine architecture and computes the checksum just as a genuine system would. The authors counter this with a timing approach, setting a maximum allowed running time based on the best simulator result they could obtain. They also construct their tests over low-level kernel memory, where it is much more difficult for a simulator to properly imitate the CPU state without perturbing the TLB miss/hit ratio.
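The challenger-side decision rule from the timing defense can be sketched as follows. The threshold values here are hypothetical placeholders; in the paper the deadline would come from benchmarking the fastest known simulator against genuine hardware.

```python
BEST_SIM_TIME = 7.5   # fastest simulated response observed (hypothetical)
NETWORK_SLACK = 0.5   # allowance for network delay (hypothetical)

def accept_response(elapsed, expected_checksum, reported_checksum):
    """Challenger-side sketch: accept only if the checksum is correct AND
    the reply arrived before the fastest simulator could have produced it."""
    deadline = BEST_SIM_TIME - NETWORK_SLACK
    return reported_checksum == expected_checksum and elapsed < deadline
```

Note that this rule silently assumes network delay is bounded and small relative to the gap between genuine and simulated computation times, an assumption the cons below call into question.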

Questions:
1. Do you think the approach of verifying the genuinity of a system through hardware and software works in general?
2. What kind of assumptions does this approach make on the system software/hardware/network bandwidth?
3. Is the running time for computation of various checksums reliable? Why?
4. How well does this system stand up against distributed simulator attacks?
5. How dependent is this system on the advancement of simulators?
6. It seems like the authentication is one-way. What problems can occur if the remote entity runs malicious code supplied by a fake authority during the challenge process?
7. The authors claim that no human intervention is needed when authenticating an unknown system. However, what does the authority need to know in order to perform the checks?
8. How well does the system work on heterogeneous systems, where entities keep different software and hardware?
9. What kind of information is necessary for one system to verify that the other is genuine?
10. What are some ways to enhance the check for genuinity for computer systems?


Pros

1. Attempts to identify the hardware automatically instead of relying on manual verification
2. Can increase the robustness of the checks by adding more hardware checks
3. Can work on consumer-grade hardware with no specialized device
4. Pseudorandom memory traversal through the TLB increases the difficulty for simulators
5. Identifies possible break-ins, which is good.
6. May relate to digital copyrights research

Cons

1. Length (too long) and hard to read
2. Many possible attacks, such as man-in-the-middle and malicious authority, were not described
3. Checksums take very long to compute (up to several minutes)
4. Network delays may throw off result
5. Faster future hardware may break the timing assumptions

Vote

Strong Accept: 0
Accept: 0
Reject: 17
Strong Reject: 2

Thanks,

Will

--
William (Will) Lee
Email: wwlee1@uiuc.edu
Computer Science, University of Illinois at Urbana-Champaign