Live Forensics for Drones

Topic Description

Drones are becoming increasingly popular, with many possible use cases. In March 2018, a new UK law was passed to regulate them. To enforce such a law, we need "Security by Design" solutions that help experts investigate incidents involving drones, so that the findings are admissible in court as evidence.
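To make the "Security by Design" idea concrete: one common building block for evidential admissibility is a tamper-evident flight log, where each record is cryptographically chained to its predecessor so that any later alteration is detectable. The sketch below is a minimal illustration of this hash-chaining idea, not a description of any particular drone's firmware; the record fields and function names are hypothetical.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first entry in the chain

def append_record(log, record):
    """Append a flight-log record, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    payload = json.dumps(record, sort_keys=True)  # canonical serialisation
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev_hash": prev_hash, "hash": entry_hash})

def verify_chain(log):
    """Return True only if no entry has been altered, reordered, or removed."""
    prev_hash = GENESIS
    for entry in log:
        if entry["prev_hash"] != prev_hash:
            return False
        payload = json.dumps(entry["record"], sort_keys=True)
        if hashlib.sha256((prev_hash + payload).encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

# Hypothetical telemetry records: timestamp, latitude, longitude, altitude.
log = []
append_record(log, {"t": 0, "lat": 51.5074, "lon": -0.1278, "alt": 30.0})
append_record(log, {"t": 1, "lat": 51.5075, "lon": -0.1279, "alt": 35.0})
assert verify_chain(log)

log[0]["record"]["alt"] = 120.0  # post-hoc tampering...
assert not verify_chain(log)     # ...is detected by verification
```

In a deployed system the chain head would also be signed or anchored externally (e.g. timestamped by a third party) so that wholesale replacement of the log is detectable too; the sketch only shows the in-log integrity property.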

Since the problem is practical, and we have a number of drones in the lab, can one think of novel ways to evade law enforcement, or, conversely, to support it? Such due diligence is needed if a technical solution is to stand the test of time.

In the field of aviation safety and security, argumentation based on evidence has been widely adopted. This research project is closely related to the topics we teach in the postgraduate module Digital Forensics (M812) and the pathway module Information Security (M811).

Skills Required:

The most important skills are scientific curiosity and engineering rigour. Specific technical skills lie in the domain of the Internet of Things, e.g. drones or autonomous vehicles, as well as time-series and location data analytics. A background in the aviation industry would be desirable.
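As a flavour of the time-series and location analytics involved: a recurring forensic question is whether a recovered flight track ever entered restricted airspace. The sketch below checks timestamped GPS fixes against a circular no-fly zone using the haversine distance; the track, zone centre, and radius are all hypothetical illustration values.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def geofence_violations(track, centre, radius_m):
    """Return timestamps at which the track is inside the no-fly circle."""
    return [t for (t, lat, lon) in track
            if haversine_m(lat, lon, centre[0], centre[1]) < radius_m]

# Hypothetical track of (timestamp, lat, lon) fixes from a flight log.
track = [
    (0, 51.4600, -0.4400),   # outside the zone
    (10, 51.4775, -0.4614),  # at the zone centre
    (20, 51.4900, -0.4800),  # outside again
]
no_fly_centre = (51.4775, -0.4614)  # hypothetical restricted-zone centre
print(geofence_violations(track, no_fly_centre, 1000.0))  # -> [10]
```

Real investigations would use published airspace polygons rather than circles, and would have to reason about GPS error and log integrity, but the core operation of joining a time series of positions against spatial constraints is the same.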

Background Reading:

[1] Y. Yu, “The aftermath of the missing flight MH370: what can engineers do?” Proceedings of the IEEE, vol. 103, no. 11, pp. 1948–1951, 2015.
[2] M. Yang, Y. Yu, A. K. Bandara, and B. Nuseibeh, “Adaptive sharing for online social networks: A trade-off between privacy risk and social benefit,” in Proc. 13th IEEE International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), Sept. 2014, pp. 45–52.
[3] M. Barhamgi, A. K. Bandara, Y. Yu, K. Belhajjame, and B. Nuseibeh, “Protecting privacy in the cloud: Current practices, future directions,” IEEE Computer, vol. 49, no. 2, pp. 68–72, 2016.
[4] M. Jackson, “System behaviours and problem frames: Concepts, concerns and the role of formalisms in the development of cyber-physical systems,” in Dependable Software Systems Engineering, 2015, pp. 79–104.
[5] B. Nuseibeh, C. B. Haley, and C. Foster, “Securing the skies: In requirements we trust,” IEEE Computer, vol. 42, no. 9, pp. 64–72, 2009.
[6] C. B. Haley, R. C. Laney, J. D. Moffett, and B. Nuseibeh, “Using trust assumptions with security requirements,” Requir. Eng., vol. 11, no. 2, pp. 138–151, 2006.
[7] J. Lockerbie, N. A. M. Maiden, J. Engmann, D. Randall, S. Jones, and D. Bush, “Exploring the impact of software requirements on system-wide goals: a method using satisfaction arguments and i* goal modelling,” Requir. Eng., vol. 17, no. 3, pp. 227–254, 2012.
[8] Y. Yu, V. N. L. Franqueira, T. T. Tun, R. Wieringa, and B. Nuseibeh, “Automated analysis of security requirements through risk-based argumentation,” Journal of Systems and Software, vol. 106, pp. 102–116, 2015.
[9] E. Letier, D. Stefan, and E. T. Barr, “Uncertainty, risk, and information value in software requirements and architecture,” in Proceedings of the 36th International Conference on Software Engineering, ser. ICSE 2014, 2014, pp. 883–894.


