Research and Publications
I used to work as a research assistant in the research group for network security, information security, and data protection at the Frankfurt University of Applied Sciences.
Publications
Shadow-Heap: Preventing Heap-based Memory Corruptions by Metadata Validation
Access: download paper, source code
Abstract:
In the past, stack smashing attacks and buffer overflows were some of the most insidious data-dependent bugs leading to malicious code execution or other unwanted behavior in the targeted application. Since reliable mitigations such as fuzzing or static code analysis are readily available, attackers have shifted towards heap-based exploitation techniques. Therefore, robust methods are required which ensure application security even in the presence of such intrusions, but existing mitigations are not yet adequate in terms of convenience, reliability, and performance overhead.
We present a novel method to prevent heap corruption at runtime: by maintaining a copy of heap metadata in a shadow-heap and verifying the heap integrity upon each call to the underlying allocator, we can detect most heap metadata manipulation techniques. The results demonstrate that Shadow-Heap is a practical mitigation approach, that our prototypical implementation incurs only reasonable overhead thanks to a user-configurable performance–security tradeoff, and that existing programs can be protected without recompilation.
BibTeX:
@inproceedings{Bouche2020ShadowHeapValidation,
title = {Shadow-Heap: Preventing Heap-based Memory Corruptions by Metadata Validation},
author = {Bouché, Johannes and Atkinson, Lukas and Kappes, Martin},
booktitle = {European Interdisciplinary Cybersecurity Conference (EICC 2020)},
year = {2020},
doi = {10.1145/3424954.3424956}
}
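The core mechanism described in the abstract — keeping a shadow copy of heap metadata and validating it on every allocator call — can be illustrated with a toy sketch. This is purely conceptual and all names are my own; the actual prototype instruments the native allocator rather than simulating one:

```python
# Toy illustration of shadow-heap metadata validation (conceptual only;
# the real Shadow-Heap prototype hooks the underlying C allocator).

class ShadowHeapAllocator:
    def __init__(self):
        self.heap = {}        # address -> chunk metadata ("real" heap)
        self.shadow = {}      # address -> independent shadow copy
        self.next_addr = 0x1000

    def malloc(self, size):
        self._verify_integrity()           # check on every allocator call
        addr = self.next_addr
        self.next_addr += size + 16        # 16 bytes of simulated metadata
        meta = {"size": size, "in_use": True}
        self.heap[addr] = dict(meta)       # metadata an overflow could hit
        self.shadow[addr] = dict(meta)     # shadow copy kept out of reach
        return addr

    def free(self, addr):
        self._verify_integrity()
        self.heap[addr]["in_use"] = False
        self.shadow[addr]["in_use"] = False

    def corrupt(self, addr, field, value):
        # simulate a heap overflow rewriting chunk metadata
        self.heap[addr][field] = value

    def _verify_integrity(self):
        for addr, meta in self.heap.items():
            if meta != self.shadow[addr]:
                raise RuntimeError(f"heap corruption detected at {addr:#x}")
```

After a simulated overflow rewrites a chunk's size field, the very next `malloc` or `free` call trips the integrity check instead of letting the corrupted metadata reach the allocator.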
Hybrid Bayesian Evolutionary Optimization for Hyperparameter Tuning
Access: download paper, source code
Abstract:
In this paper, we present a Hybrid Bayesian–Evolutionary tuning algorithm (HBEtune) for tuning machine learning algorithms or evolutionary algorithms, and analyze its performance. HBEtune combines meta-EA and Bayesian optimization techniques.
As hyperparameter tuning is a noisy, black-box optimization problem with expensive target functions, practical tuners must aim to minimize the number of necessary samples. In our method, we guide the EA's recombination operator towards more promising samples by employing the expected improvement acquisition criterion commonly used in Bayesian optimization. The expected improvement is evaluated on a surrogate model using a Gaussian process regression.
HBEtune shows generally competitive performance when compared with the state-of-the-art irace tuner. Performance is analyzed across a suite of synthetic and real-world benchmark problems.
BibTeX:
@inproceedings{Atkinson2020HybridBayesianEvolutionary,
title = {Hybrid Bayesian Evolutionary Optimization for Hyperparameter Tuning},
author = {Atkinson, Lukas and Müller-Bady, Robin and Kappes, Martin},
booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2020)},
year = {2020},
pages = {225--226},
doi = {10.1145/3377929.3389952}
}
Multijob: a framework for efficient distribution of evolutionary algorithms for parameter tuning
Access: download paper, source code
Abstract:
An important challenge in designing evolutionary search heuristics is the statistically significant evaluation of different configurations. The goal is to find an optimal algorithm design with respect to its parameters, i.e., parameter tuning. In this paper, we propose an open source software framework, called Multijob, which simplifies and automates EA configuration and parameter tuning. Additionally, the framework offers a workflow for distributed execution of the preconfigured algorithms in heterogeneous computing clusters or grids.
BibTeX:
@inproceedings{Mueller2017MultijobEfficientDistribution,
title = {Multijob: a framework for efficient distribution of evolutionary algorithms for parameter tuning},
author = {Mueller-Bady, Robin and Kappes, Martin and Atkinson, Lukas and Medina-Bulo, Inmaculada},
booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference Companion (GECCO '17)},
year = {2017},
pages = {1231--1238},
doi = {10.1145/3067695.3082476}
}
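The kind of workflow such a framework automates — enumerating parameter configurations and farming the runs out to workers — can be sketched roughly as follows. All names here are hypothetical and this runs on a single machine; the actual framework targets heterogeneous clusters and grids:

```python
# Conceptual sketch of distributed EA parameter tuning (hypothetical names;
# not the Multijob API). Each configuration is evaluated by a worker.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_ea(config):
    # stand-in for one full evolutionary-algorithm run with these parameters
    pop_size, crossover_rate = config
    fitness = 1.0 / (pop_size * crossover_rate)   # placeholder result
    return config, fitness

def tune(param_grid, workers=4):
    # enumerate the cross-product of all parameter values
    configs = list(product(*param_grid.values()))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(run_ea, configs))
    return min(results, key=lambda r: r[1])       # best (config, fitness)
```

A tuning run would then look like `tune({"pop_size": [10, 20], "crossover_rate": [0.5, 0.9]})`, with the framework handling job preconfiguration and dispatch instead of a local thread pool.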