Coding Projects

Web Scraping

Job portal bot for document extraction

This project is an example of a Selenium-based web bot that automatically navigates a job portal which publishes its postings only as PDFs, where the PDF links are not embedded in the page source but attached to JavaScript handlers.

The crawler is part of an academic project to monitor job market developments during the COVID-19 pandemic. 


See repository on GitHub

Web scraper for JavaScript-heavy websites

This project is an example of a scraper that uses the Selenium WebDriver to render JavaScript elements on a webpage and prints non-machine-readable text as PDF to a local directory.

The scraper is part of an academic project to monitor job market developments during the COVID-19 pandemic.



See repository on GitHub

Web scraper for hidden web elements

This project is an example of a requests-based scraper that uses Selenium to handle a "click for more" button revealing additional information, collects all URLs, and then scrapes each one with a simple BeautifulSoup parser. The scraper is part of an academic project to monitor job market developments during the COVID-19 pandemic.


See repository on GitHub

Web scraper for annual report extraction

This project is an example of a requests-based scraper that downloads large numbers of annual reports automatically. The companies of interest are selected via an input list of company names in Excel format.

The scraper is part of an academic project that applies textual analysis to the content of annual reports of S&P500 companies. 
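A minimal sketch of this workflow, under stated assumptions: the Excel column name (`Company`), the URL template (`example.com` is a placeholder), and the helper names are all invented for illustration and do not come from the repository.

```python
# Sketch: read company names from an Excel list with pandas, then fetch each
# report over HTTP with requests and save it locally. Column name and URL
# template are placeholder assumptions.
from pathlib import Path
import pandas as pd
import requests

def slugify(name: str) -> str:
    """Turn a company name into a URL/file-friendly slug."""
    return name.strip().lower().replace(" ", "-")

def download_reports(excel_path: str, out_dir: str) -> None:
    companies = pd.read_excel(excel_path)  # expects a "Company" column (assumed)
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    for name in companies["Company"]:
        slug = slugify(name)
        url = f"https://example.com/reports/{slug}.pdf"  # placeholder template
        resp = requests.get(url, timeout=30)
        if resp.ok:
            Path(out_dir, f"{slug}.pdf").write_bytes(resp.content)
```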


See repository on GitHub

Text Analysis

Text analysis tool for job portal data

This project provides a toolkit for the textual analysis of skills in job descriptions from job portal data. The tool is dictionary-based, i.e., it searches for predefined keywords and regular expressions in the text data after normalizing it. All input files are in Excel format to make the tool accessible to non-programmers as well. The tool is part of my World Bank policy analysis (joint with Alicia Marguerie and Stefanie Brodmann) on how to use job platform data to understand skills demand in development contexts. You can find the paper here.
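The dictionary-based matching can be sketched as follows. The skill dictionary shown here is purely illustrative, not the one used in the paper; the point is the pipeline of normalizing the text and then counting regular-expression hits per skill.

```python
# Sketch of dictionary-based skill extraction: normalize the text, then
# count matches of each predefined pattern. The dictionary is illustrative.
import re

SKILL_PATTERNS = {  # toy dictionary, not the paper's actual one
    "python": re.compile(r"\bpython\b"),
    "communication": re.compile(r"\bcommunicat\w*\b"),
}

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace before matching."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def count_skills(text: str) -> dict[str, int]:
    """Return the number of dictionary hits per skill in one job description."""
    t = normalize(text)
    return {skill: len(pattern.findall(t)) for skill, pattern in SKILL_PATTERNS.items()}
```

In the real tool the dictionary and job descriptions would be read from the Excel input files; here they are inlined to keep the sketch self-contained.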


See repository on GitHub

Web Development

Web application "Korbfueller" for grocery shopping optimization

"Korbfueller" is web application which I developed during my doctoral studies.  The backend is programmed using Python/Flask and the frontend is designed in HTML, CSS and Java. The purpose of the app is to simpify your supermarket shopping for basic groceries in the presence of financial and mental constraints. Using web scraped supermarket data, the program helps you to find the cheapest products in the market of your choice and at the same time distribute your money as balanced as possible between different product categories. A prototype is available here (password and user name available on request). You can find the academic paper explaining the algorithm here.


Disclaimer: This is my private web page and the views expressed in the material posted here are my own and do not reflect those of the European Central Bank or of the Eurosystem.