

(production) computer, which are physically interconnected through Ethernet cables. They also
share a common hard disk which utilises NFS (Network File System).41
The act of sabotage described here entails
making the two computers logically connected, creating connectivity that was purposely
blocked in the initial setup of the IT system. This, however, is an act of sabotage that requires
formidable hacking skills. As mentioned above, breaking through the Front-End security code
is, while no small matter, something many can achieve. Creating connectivity in the manner
described here, however, is not an easy achievement.
With the current security architecture, a checksum is created during the production process on
the CPU (Central Processing Unit) of the Piql (reception and processing) computer when client
data is received and prepared for writing. The same checksum is held on the CPU of the Piql I/O
(production) computer. During verification, which is done right before the writing process onto
the piqlFilm is started, the production computer's CPU checks the digital file it has received
through the NFS against the checksum created on the processing computer's CPU. If a threat
actor has made any alterations to the information before it was stored on the NFS, this
verification process will detect it, as the checksums will no longer be identical.
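As a purely illustrative sketch of this verification step (the report does not specify which checksum algorithm Piql AS uses; SHA-256 and the file paths below are assumptions made here for illustration only), the production side could recompute the checksum over the file it receives via the NFS and compare it with the value created on the processing side:

    import hashlib

    def sha256_of(path):
        # Compute a SHA-256 digest of a file in fixed-size chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def prepare_on_processing_side(path):
        # Checksum created when the client data is received and prepared for writing.
        return sha256_of(path)

    def verify_on_production_side(path, expected):
        # Recompute over the file received through the NFS and compare with the
        # checksum created on the processing computer before writing to piqlFilm.
        actual = sha256_of(path)
        if actual != expected:
            raise RuntimeError("Integrity check failed: file altered on the NFS")
        return True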
If, however, the threat actor has managed to create connectivity between the two computers,
they have logical access to both computers' CPUs. It is then possible for the threat actor to alter
the client data and the corresponding checksum on the processing computer's CPU, and also to
alter the production computer's CPU so that it holds the same altered checksum. The client data
is thus no longer safe from attacks on its integrity. This is, in other words, a potentially
disastrous vulnerability, but as it requires a threat actor with quite substantial abilities, the
likelihood of it being exploited is low.
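The reason the verification no longer helps is simply that an attacker with logical access to both sides can recompute the checksum over the altered data and plant the new value wherever it is compared. The following short sketch (same illustrative assumptions as above) shows the idea:

    import hashlib

    def tamper_with_access_to_both_sides(path, malicious_bytes):
        # Alter the client data on the shared NFS ...
        with open(path, "wb") as f:
            f.write(malicious_bytes)
        # ... and recompute the checksum over the altered file. An attacker who can
        # plant this value on both the processing and the production computer makes
        # the verification step pass despite the tampering.
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()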
The third weakness in the Piql IT security architecture has to do with cryptography, and it is a
key issue. Today, Piql AS provides no cryptographic protection of the information on the
piqlFilms.42
A user of the Piql Preservation Service can, however, encrypt the data before
transferring the files to the Piql partner if they wish. This is then done at their own cost and
risk, and the user is themselves responsible for managing and storing the encryption key.
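As a purely illustrative sketch of what such client-side protection involves (the library, file names and key handling below are assumptions made here, not anything prescribed by Piql AS), a user could encrypt a file before hand-over and would then carry the full responsibility for keeping the key:

    from cryptography.fernet import Fernet  # third-party 'cryptography' package

    # The user generates and stores the key; losing it makes the data unrecoverable.
    key = Fernet.generate_key()
    with open("piql_package.key", "wb") as f:
        f.write(key)

    # Encrypt the file before transferring it to the Piql partner.
    with open("client_data.package", "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open("client_data.package.enc", "wb") as f:
        f.write(ciphertext)

The questions raised below about key loss apply directly to a file like the hypothetical piql_package.key above: if it disappears, the encrypted content preserved on the piqlFilm is effectively lost.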
By not offering cryptographic services themselves, Piql AS is offering a weak information
security setup, especially regarding confidentiality, and pointing out that their clients are free to
encrypt the data themselves does not make up for that fact. Indeed, placing any responsibility for,
and functionality of, IT security related to their service outside their own system is ill-advised.
Who is to say what will happen if and when an encryption key is lost? Who will be responsible
for the insecure preservation of the information?
Of course, there is a reason why Piql AS does not wish to provide cryptographic services as part
of the Piql Preservation Services: the concept of self-containment. Piql AS wants the
information preserved using their service to be self-contained, in keeping with the principle of
41 See figure 5.3 in chapter 5 as a reference.
42 The information regarding cryptography was given during a meeting with Alfredo Trujillo, Product Manager at Piql AS, and Tore Magne Skar, Project Manager at Piql AS, on 23.11.15.