
Research Group Codes & Cryptography

Open Topics for Theses

Master's Theses

Proxy Re-Encryption for SFC-ABE

Attribute-based encryption (ABE) schemes allow one to implement efficient, fine-grained cryptographic access control to data. In ABE, data is encrypted under a given access structure. Users possess keys according to their rights or attributes, respectively. A user is able to decrypt a ciphertext, and hence to access the corresponding data, if and only if his attributes satisfy the access structure of the ciphertext.

The decryption of ABE ciphertexts is relatively complex. To be able to decrypt ABE ciphertexts on resource-constrained devices, Green et al. proposed a proxy re-encryption scheme for ABE [1]. Here, a proxy transforms an ABE ciphertext into a conventional ElGamal ciphertext with the help of an ABE transformation key. The ElGamal ciphertext can then be decrypted efficiently in the standard way. Even with the transformation key, the proxy obtains no information about the encrypted data.
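To make the workflow concrete, the following hypothetical Java interface sketches the three operations involved; all type and method names are ours and do not refer to [1] or to any existing library.

```java
/**
 * Hypothetical interface sketch of the outsourced-decryption workflow of [1].
 * All type and method names are illustrative placeholders.
 */
public interface OutsourcedAbeDecryption {
    // Opaque placeholder types, declared only so that the sketch is self-contained.
    interface AbeSecretKey {}
    interface AbeCiphertext {}
    interface TransformKey {}
    interface RetainedSecret {}
    interface ElGamalLikeCiphertext {}

    /** User: derive a transformation key for the proxy; the user keeps the small secret z. */
    TransformKey generateTransformKey(AbeSecretKey sk, RetainedSecret z);

    /** Proxy: turn an ABE ciphertext into a short ElGamal-like ciphertext.
     *  Even with the transformation key, this step reveals nothing about the plaintext. */
    ElGamalLikeCiphertext transform(AbeCiphertext ct, TransformKey tk);

    /** User: finish decryption cheaply, e.g., on a resource-constrained device. */
    byte[] decrypt(ElGamalLikeCiphertext partialCt, RetainedSecret z);
}
```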

The thesis deals with a subset of the following aspects of the proxy re-encryption scheme of [1]:

  1. In [1], only an outline of the security proof is given. The task is to complete this outline into a rigorous proof of security.
  2. The re-encryption scheme of [1] is based on the ABE encryption scheme of [2]. The task is to generalize the approach of [1] such that it can be applied to other ABE schemes, e.g., [3].
  3. In [1], the Fujisaki-Okamoto transform is used to achieve CCA security in the random oracle model. The construction of [1] is a proof of concept and uses a one-time pad to encrypt the message. For practical applications, it is preferable to combine the scheme with a symmetric block cipher mode in a hybrid encryption scheme (see the sketch after this list). The task is to adapt the construction of [1] accordingly.
  4. The construction of [1] ensures confidentiality of the message even against malicious adversaries. Authenticity, however, is not guaranteed in the presence of malicious proxies, but [1] outlines an extension that provides authenticity as well. The task is to formally specify the extended scheme and prove its security.
  5. The research group Codes & Cryptography develops a Java framework for pairing-based cryptography. The task is to extend this framework to include the scheme of [1], preferably with the extensions of Tasks 2, 3, and 4.
  6. It is planned to use the approach of [1] in the collaborative research project “Securing the financial cloud (SFC)”. The task is to apply the scheme of [1] to an SFC use case.
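As a point of reference for Task 3, the following minimal Java sketch illustrates only the symmetric (DEM) part of a hybrid scheme using AES-GCM; the key is simply generated here, whereas in the real scheme it would be encapsulated under the ABE ciphertext. It makes no use of the scheme of [1] itself.

```java
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

/** Minimal KEM/DEM-style sketch: a symmetric key encrypts the payload with AES-GCM. */
public class HybridDemSketch {
    public static void main(String[] args) throws Exception {
        // In a real hybrid scheme, this key would be encapsulated under the ABE part.
        SecretKey k = KeyGenerator.getInstance("AES").generateKey();

        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, k, new GCMParameterSpec(128, iv));
        byte[] ciphertext = enc.doFinal("some payload".getBytes("UTF-8"));

        Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, k, new GCMParameterSpec(128, iv));
        System.out.println(new String(dec.doFinal(ciphertext), "UTF-8"));
    }
}
```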

The focus of the thesis will be selected according to the background of the student.

The thesis will be supervised jointly by Paderborn University's research group Codes & Cryptography and Wincor Nixdorf International GmbH.

Language: English or German.

Literature

  • [1] Matthew Green, Susan Hohenberger, and Brent Waters. "Outsourcing the Decryption of ABE Ciphertexts." USENIX Security Symposium, 2011.
  • [2] Brent Waters. "Ciphertext-Policy Attribute-Based Encryption: An Expressive, Efficient, and Provably Secure Realization." Public Key Cryptography – PKC 2011, Springer, 2011, pp. 53-70.
  • [3] Yannis Rouselakis and Brent Waters. "Practical Constructions and New Proof Methods for Large Universe Attribute-Based Encryption." Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security (CCS), ACM, 2013.

Implementation and optimization of elliptic curve math in Java

Elliptic curve groups are an integral part of modern cryptography. Modern constructions are usually based on elliptic curves that allow a pairing, which is a bilinear map between points on the curve.
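For orientation, the defining property of such a pairing, in standard notation and independent of any particular implementation, is its bilinearity:

```latex
e : G_1 \times G_2 \to G_T, \qquad
e(g^a, h^b) = e(g, h)^{ab}
\quad \text{for all } g \in G_1,\; h \in G_2,\; a, b \in \mathbb{Z}.
```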

As such, our group has implemented a math library that provides elliptic curve groups and pairings. It is available on GitHub, together with a large number of schemes built upon it. The current implementation of elliptic curves is functionally complete, but computations take a (relatively) long time. There is much room for optimizations, which the library is already prepared to incorporate easily.

The topic of the thesis is to understand, explain, and implement some of these optimizations. It is a highly mathematical topic (the optimizations are all rooted in algebraic tricks) and fun to work on. You should be comfortable with, or ready to learn about, finite field extensions and elliptic curve groups. A background in algebra is helpful.

Improving features of anonymous communication

Many cryptographic systems, such as group signatures, have built-in privacy. For group signatures, this means that no polynomial-time adversary can, given a signature, find out who signed it (he only learns that some member of the group signed the message). In practice, much of this anonymity is lost when the system is used over the internet: if A sends an anonymous signature to B, then the signature itself does not reveal A's identity, but the IP address in the packet headers does.

Using an anonymous communication system, a set of parties can communicate anonymously, i.e., the receiver of a message does not learn who sent it, and the sender of a message knows the receiver only under some pseudonym. Today, this is usually done using Tor. Recent research has culminated in a new system that is based on trusted execution environments (such as Intel's SGX) and offers a much higher degree of security than Tor and similar schemes.

We suggest several possible extensions to this system:

  • Revocation of pseudonyms (using, for example, Bloom filters)
  • Efficiency improvements using network coding
  • Allow nodes to dynamically join or leave the system
  • ...

The thesis should deal with (some of) these extensions. Optionally, these ideas can be implemented in our existing prototype (using the SGX SDK; the language is C++). There are initial ideas on how to realize them, but there is plenty of room for new ones. The original construction uses ideas from both cryptography and overlay networks and is a nice application of both.

Bachelor's Theses

Communication-optimal accumulators based on bilinear maps

A cryptographic accumulator is a constant-size representation of a (large) set S. For every element in S, there is a short witness such that anyone given the accumulator and the witness can verify that the element is indeed included in the original set S. For elements not in S, finding such witnesses is computationally intractable (under standard cryptographic assumptions).

A typical application of accumulators is revocation (e.g., of access rights). When someone gets certain access rights, their ID is included in some public accumulator. To get access, they simply compute a witness for their ID in the accumulator and supply it to the verifier. If access needs to be revoked at some point, the responsible party simply removes the ID from the accumulator, preventing the revoked party from gaining access in the future.

Usually, whenever the accumulator changes to a new set S', every owner of an accumulated ID in S' needs to update his witness so that it fits the accumulator for S' instead of S. A desirable trait of accumulators is the ability to update witnesses incrementally instead of recomputing them for new sets from scratch. Baldimtsi et al. recently published an accumulator with the unique property that if S' is a superset of S (i.e., only additions occurred), no witness updates are necessary at all. They base their construction on the strong RSA assumption.

However, in many contexts RSA-based accumulators are inconvenient. Many constructions that would want to use accumulators for revocation are based on prime-order groups with bilinear maps, which makes them incompatible with the published accumulator.

The rough idea behind this thesis topic is to first understand the construction of Baldimtsi et al. and to provide detailed proofs for their claims (they only sketch proof ideas in their paper). Then, you should transfer their ideas to the prime-order group setting, making them applicable to a large range of modern cryptographic constructions. For this, the accumulator by Nguyen serves as a base construction for an accumulator that can be extended with methods similar to those of Baldimtsi et al.
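For reference, Nguyen's accumulator for a set S of field elements works roughly as follows (our notation; e is a pairing, g a group generator, and s a trapdoor used only during setup):

```latex
\mathrm{acc}_S = g^{\prod_{x \in S}(s + x)}, \qquad
\mathrm{wit}_x = g^{\prod_{y \in S \setminus \{x\}}(s + y)}, \qquad
\text{verification: } e\big(\mathrm{wit}_x,\, g^{s} g^{x}\big) \overset{?}{=} e(\mathrm{acc}_S,\, g).
```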

Experimental Investigation of the SEM Algorithm

A typical task in cluster analysis is to describe data by probability distributions. Here, the data are given as a D-dimensional point set X, and the goal is to find a density function p from a given family of density functions that describes the point set best. How well the distribution p describes the set X can be measured by the product of the densities of the individual points, i.e., \prod_{x \in X} p(x). This problem is also known as the maximum-likelihood estimation (MLE) problem.

A standard approach to this problem is the Expectation Maximization (EM) algorithm [1]. It takes an initial density function as input and then tries to improve it step by step. The so-called Stochastic EM (SEM) algorithm is a randomized variant of the EM algorithm that replaces some computations of the (deterministic) EM algorithm by a random experiment that is meant to imitate them. The advantage is an improved running time compared to the EM algorithm. The question is how well the SEM algorithm actually imitates the EM algorithm.

A typical family of distributions for which the MLE problem is considered are mixtures of Gaussian distributions. How well the SEM algorithm imitates the EM algorithm in this case has already been studied (experimentally and theoretically) in [2]. In this bachelor's thesis, the SEM and EM algorithms are to be compared for further, simple distributions (e.g., the exponential distribution).
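To indicate the difference between the two algorithms for a mixture with K components with densities p_1, ..., p_K and weights w_1, ..., w_K (our notation, not that of [1] or [2]): the EM algorithm computes soft assignments in its E-step, while SEM samples one hard assignment per point.

```latex
% E-step of EM: responsibility of component k for a point x
\gamma_k(x) = \frac{w_k\, p_k(x)}{\sum_{j=1}^{K} w_j\, p_j(x)}

% SEM replaces the soft assignment by a random draw
z(x) \sim \mathrm{Categorical}\big(\gamma_1(x), \ldots, \gamma_K(x)\big)

% The M-step then re-estimates each component's parameters,
% in SEM using only the points assigned to that component.
```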

[1] Christopher M. Bishop: Pattern Recognition and Machine Learning (Information Science and Statistics).

[2] Johannes Blömer, Kathrin Bujna, Daniel Kuntze: A Theoretical and Experimental Comparison of the EM and SEM Algorithm. In Proceedings of the 22nd International Conference on Pattern Recognition (ICPR) 2014, Stockholm, Sweden, pp. 1419-1424, IEEE, 2014.

Efficient finite-field arithmetic in Java

Most people are familiar with the field Zp, the integers modulo a prime number p. This field is the basis for crucial mathematical constructs, such as extension fields and, in turn, elliptic curve groups. For this reason, any improvement to its implementation can speed up cryptographic systems considerably.

While Zp is structurally very simple, there are neat tricks to improve efficiency of its operations. For example, Montgomery multiplication aims at getting rid of the expensive modular reduction step that needs to be done after each operation and replacing it with cheaper operations.
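As a rough illustration of the idea (not a statement about how our library implements it), the following self-contained Java sketch realizes Montgomery multiplication on top of BigInteger; a real implementation would of course operate on machine words rather than BigInteger.

```java
import java.math.BigInteger;

/** Illustrative Montgomery arithmetic over Z_p on top of BigInteger. */
public class MontgomeryZp {
    private final BigInteger p;      // odd prime modulus
    private final int k;             // bit length of p, so that R = 2^k > p
    private final BigInteger rMask;  // R - 1, reduces mod R via a cheap AND
    private final BigInteger pPrime; // p' = -p^{-1} mod R

    public MontgomeryZp(BigInteger p) {
        this.p = p;
        this.k = p.bitLength();
        BigInteger r = BigInteger.ONE.shiftLeft(k);
        this.rMask = r.subtract(BigInteger.ONE);
        this.pPrime = p.negate().modInverse(r);
    }

    /** Convert into Montgomery form: x * R mod p. */
    public BigInteger toMontgomery(BigInteger x) {
        return x.shiftLeft(k).mod(p);
    }

    /** Convert back from Montgomery form: xBar * R^{-1} mod p. */
    public BigInteger fromMontgomery(BigInteger xBar) {
        return redc(xBar);
    }

    /** Montgomery product: (xBar * yBar * R^{-1}) mod p, without a division by p. */
    public BigInteger multiply(BigInteger xBar, BigInteger yBar) {
        return redc(xBar.multiply(yBar));
    }

    /** REDC: maps T < p*R to T * R^{-1} mod p using only shifts, masks, and multiplications. */
    private BigInteger redc(BigInteger t) {
        BigInteger m = t.and(rMask).multiply(pPrime).and(rMask); // m = (T mod R) * p' mod R
        BigInteger u = t.add(m.multiply(p)).shiftRight(k);       // u = (T + m*p) / R, which is < 2p
        return u.compareTo(p) >= 0 ? u.subtract(p) : u;
    }
}
```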

The goal of this thesis is to implement arithmetic on Zp efficiently, using optimizations such as Montgomery multiplication (and possibly many others). If all goes well, your work will become part of our open-source math library and speed up our advanced constructions. You can also use existing code to benchmark your implementation and compare it against our current "naive" solution based on Java's BigInteger.

Sigma protocols for anonymous credentials

In a recent project group, we implemented an anonymous credential system. In such a system, users are given credentials that they can use to authenticate anonymously (i.e. the verifier learns only that some user with access rights is talking to it, but does not learn which one).

Such credentials are associated with attributes (such as age, country, etc.) over which statements (such as "age > 18", "country = 'Germany'") are proven in a privacy-preserving manner. Such a proof is enabled by Sigma protocols (usually based on Schnorr's protocol, which you may know from our lectures). The project group implemented several Sigma protocols (one, for example, that does an "age > 18" proof) and ways to compose them.
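To illustrate the basic building block (a toy, self-contained example with tiny parameters, not the project group's code), Schnorr's protocol proves knowledge of x with h = g^x:

```java
import java.math.BigInteger;
import java.security.SecureRandom;

/** Toy Schnorr Sigma protocol: proving knowledge of x with h = g^x mod p. */
public class SchnorrSketch {
    public static void main(String[] args) {
        SecureRandom rng = new SecureRandom();

        // Tiny toy group: the subgroup of order q = 11 in Z_23^*, generated by g = 2.
        BigInteger p = BigInteger.valueOf(23);
        BigInteger q = BigInteger.valueOf(11);
        BigInteger g = BigInteger.valueOf(2);

        // Prover's secret witness x and the public statement h = g^x mod p.
        BigInteger x = new BigInteger(q.bitLength(), rng).mod(q);
        BigInteger h = g.modPow(x, p);

        // 1. Commitment: the prover picks a random r and sends a = g^r mod p.
        BigInteger r = new BigInteger(q.bitLength(), rng).mod(q);
        BigInteger a = g.modPow(r, p);

        // 2. Challenge: the verifier picks a random c in Z_q.
        BigInteger c = new BigInteger(q.bitLength(), rng).mod(q);

        // 3. Response: the prover sends s = r + c*x mod q.
        BigInteger s = r.add(c.multiply(x)).mod(q);

        // Verification: accept iff g^s == a * h^c (mod p).
        boolean accept = g.modPow(s, p).equals(a.multiply(h.modPow(c, p)).mod(p));
        System.out.println("Proof accepted: " + accept);
    }
}
```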

The goal of this thesis is to research and implement new Sigma protocols that enable different statements, or are more efficient than the ones we currently use. One example for such an improvement is batch verification of Sigma protocols. During the work on the thesis, you (with our help) will identify more possible improvements. If everything goes well, your work will become part of our open-source anonymous credential system and provide performance improvements and/or additional features.

On the security of identity-based encryption in the random oracle model

Some of the more advanced encryption schemes are identity-based encryption (IBE) schemes. An IBE scheme is a special public-key encryption scheme in which the encryption algorithm takes the receiver's identity (e.g., his email address) as input instead of the receiver's public key. This means that, essentially, there is no need for public keys anymore, so the sender of a message does not have to look up the receiver's public key in some public-key infrastructure.

In our upb.crypto.craco library, we have implemented an IBE scheme based on Waters' construction. As it turns out, encrypting in this scheme is somewhat costly. Because of this, we replaced the special algebraic hash function used within Waters' construction with a more standard hash function (e.g., SHA-2), which is more efficient. We are (very) confident that this results in a secure scheme, but we do not have a formal proof.
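Roughly, the algebraic hash in Waters' construction maps an n-bit identity v = (v_1, ..., v_n) to a group element using public group elements u', u_1, ..., u_n, and the modification replaces this map by a standard hash into the group that is modeled as a random oracle:

```latex
H_{\mathrm{Waters}}(v) \;=\; u' \prod_{i\colon v_i = 1} u_i
\qquad\rightsquigarrow\qquad
H(v) \in G, \quad H \text{ a standard hash function, modeled as a random oracle.}
```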

In this thesis, you should understand and explain the Waters IBE scheme and prove that the modified scheme is secure (in the random oracle model, an idealized model for hash functions). Optionally, you can apply your insights to an attribute-based encryption scheme (a powerful generalization of IBE), where the same hash function is used and has been replaced in the same manner in our implementation.
