- CISO for a company in Germany - likes drawing on iPad - interested in too many topics - (chaos) magician and dimensional traveler - thinks that "Senior Nerd" should be a job title #100days

Art and how it affects my life

After divorcing my ex-wife, I met my current girlfriend. Her drive in life is largely determined by art. Whenever she comes in contact with any kind of art, she explodes in a ball of energy that inevitably carries you along. And I love getting infected by this energy, too. Sometimes we sit together in a room, doing our own stuff... she's drawing, I'm working. But sometimes I put my work aside, pick up my iPad, start drawing and then it happens... I swim through the waves of creative energy along with her.

It's such an impressive feeling when the world around me slowly fades away and only me and my drawing still seem to exist. And after some hours I "wake up" from that trance, look at the screen and think: "Wow, I created that?" This feeling is so unique, and yet it also brings back memories of my childhood every time. Because as a child I loved to draw. However, in the course of my career in the IT industry, I stopped creating fine art. It's not that I had no connection to art anymore. I discovered art in software design, and the unfussy beauty of mathematics always excited me. And yet it's something else to paint pictures.

Interestingly, I have also reconnected with my children through art on a deeper level. They enjoy it when they can be with me and my girlfriend and paint pictures. And I enjoy it when I see their colorful drawings, which give me a deeper insight into their world.

It's so impressive to see how much magic is in their view of the world. And that, in turn, makes me rediscover the magic in my world. For a long time I was looking for the magic in myself. I worked with different magical systems, like Enochian Magic, the system of the Order of the Golden Dawn, with Chaos Magick, Tantra, Ice Magick and many more. But it took art as a key to realize that magic is actually all around me. And with this discovery, I also found many things again that I had lost since my childhood. I rediscovered the beauty of nature and discovered natural mathematical logic. I learned (again) not only to see people, but also to feel them. And I also realized that it is my connection with this world that makes the real magic possible. Suddenly, many things from Chaos Magic and what is commonly called Tantra (I don't like this term because it was and is quite abused by the New Age movement) finally made sense.

And suddenly I'm living in a whole new world. My relationship works, not always without conflict, but always with the possibility of solving these conflicts together. With my children I have a connection like I never had before, although I can no longer live with them (but I see them every day). I enjoy my job again. And many things that I used to see as a problem, don’t affect me that much any longer.

I can therefore only advise everyone to open up to art when they find themselves in a situation where there seems to be no way forward and no way back. Art can be a key to look at the world from new angles and find new ways.

IaC - why you should(n't) use it

Yes, I hate IaC (Infrastructure as Code)... and I love it... sometimes.

Of course there are a lot of advantages to IaC. It makes infrastructure reproducible (partially), auditable (partially) and by that... easier to control (partially). But you should always take a closer look at whether it's really useful for your company. In fact it's not useful for more or less static infrastructures. If you don't run a server network with more than 100 servers, or a network that changes constantly, IaC is for sure not for you. If you run only some dedicated servers in a data center for your website and email and perhaps an ownCloud or similar, IaC is definitely not for you! Why?

There is no tool that fits your needs

First of all, you'll never find a tool that fits all your needs. In the end you'll use a bunch of tools: for example, Terraform to provision your virtual machines, Ansible to deploy and update your software, Packer to build VM images, and if you work with cloud environments like Kubernetes you'll also use tools like kOps and of course Dockerfiles. In the end you have to manage more software than you had to manage before.

More Version Conflicts

"But software management with IaC is much easier!" No, it's not! Because there is another problem: version conflicts. Never expect that the new version of a tool is backward compatible with the version you use. It's more realistic to expect broken state files if you update your tool while any of your team members is still using the old version. And it's also realistic to expect that you'll have to rewrite some of your IaC code after updates, because of incompatible parameters and the like. So you have to manage more software than you had before. In addition to your tech stack, you now have to manage your IaC tools. Congratulations! Perhaps you had a tech stack consisting of web servers, DB servers, a caching layer, load balancers and perhaps some security-related tools and a CI/CD suite (in the best case everything in containers), and now you have Terraform, Ansible, kOps and Packer, which will also cause version conflicts. What an advantage! ;)
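One way to at least contain these version conflicts is to pin the tool and provider versions in the code itself, so a teammate on a different version fails fast instead of silently migrating your state. A minimal sketch in Terraform syntax (the version numbers and the aws provider are placeholders, not a recommendation):

```hcl
terraform {
  # Refuse to run with a Terraform release outside this range.
  required_version = "~> 1.5"

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"   # pin the provider's major version, too
    }
  }
}
```

This doesn't remove the maintenance burden described above; it only turns surprise state migrations into explicit, coordinated upgrades.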


One of the biggest claimed pros of IaC is better collaboration. This may be correct if you implement very strict guidelines about how to use the IaC suite in your infrastructure team. If you don't, you'll end up with a bunch of cruft code, non-reusable "modules" and islands of knowledge, where some of your team members understand only parts of your IaC infrastructure.

If you decide to use IaC, never forget that other departments or external partners of your company may collaborate with your infrastructure team. And the other departments are presumably not involved in update management, or your external partners may never have worked with IaC tools before. Congratulations! You now have some additional problems in your company. Your infrastructure team must integrate the environments that were built by external partners into the IaC infrastructure, and your developers will have only a partial understanding of your infrastructure, because your sysadmins think that IaC is enough documentation.


Never forget that IaC consumes a lot of time. It consumes time not only while setting up the infrastructure; it also consumes time whenever you change anything in your infrastructure. You need a little change to your infrastructure, like a new VM instance? Ok, what would be the "classical" way?

You login to your cloud/hosting provider and start a new server instance.

You install your software on this new instance.

Perhaps you add it to your Load Balancers.


What is Terraform doing?

You write your new infrastructure definition.

It checks if the syntax of your IaC code is correct. -> You fix your code.

It checks your state file. -> Hopefully there are no broken state files, or your sysadmins will spend the next hours fixing them.

It checks whether your infrastructure is compliant with the expected state.

It tells your sysadmin what it will do in this run.

Your sysadmin has to confirm it. But in most cases they will check why some of the changes are needed, and they must coordinate the changes in your cross-team environment.

It will do all the changes. You cannot simply skip some of them without changing your IaC code.

In the end you'll need an hour for a process that would typically take 3 minutes, because your sysadmins have to change IaC code or coordinate unexpected or unwanted updates.
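For illustration, the "new infrastructure definition" from the first step is often just a small resource block like the following sketch (the aws provider, AMI ID and instance type are invented placeholders); terraform plan and terraform apply then run through exactly the check/confirm cycle described above:

```hcl
# Hypothetical definition of one additional VM instance.
resource "aws_instance" "web_3" {
  ami           = "ami-0123456789abcdef0"  # placeholder image ID
  instance_type = "t3.small"               # placeholder size

  tags = {
    Name = "web-3"
  }
}
```

Even this four-line change triggers a full state refresh and a plan over the whole configuration before anything is created.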

It becomes even more time-consuming if you decide to integrate IaC into your current infrastructure. The typical way would be to set up a completely new environment that is entirely managed with IaC. I have done this process 3 times now. Every time, it bound all resources of the infrastructure teams for several weeks. Please don't expect that your current infrastructure will still be managed by your sysadmins. They have enough to do with your new IaC-based infrastructure and no time for "the old stuff". So you have to expect that your old infrastructure won't be updated until you move to your new "modern" IaC-based infrastructure, unless you add some additional sysadmins to your team.

The first time I did an IaC integration was at a time when tools like Ansible or Terraform were not available. So we wrote our tools ourselves... with Perl. We invested most of our free time in coding, but in the end we had a tool that we called "ASP Tool" (Application Service Providing Tool). It was perfect, because it was designed for our very specific infrastructure, consisting of classical webserver environments (LAMP stacks), some in-house developed search engines and some very project-specific software. And it was perfectly integrated into our CI/CD environment. Furthermore, it only applied the changes that were defined in the state files. Only an additional parameter triggered a check of whether our infrastructure was compliant with the state files. So we could make changes nearly as fast as we would have done them manually. This is not possible with newer tools like Terraform, Ansible, Puppet or Chef, because they will always check the state of your complete network. Sure, you can split your infrastructure into multiple repositories, but that way you'll end up with a lot of repositories, of which (hopefully) only your security department will have an overview.

If you use "modern" tools like Ansible, Terraform or Chef, remember that they were never designed for your network. They must meet the requirements of a lot of different environments. This is a nearly impossible balancing act and the reason why you'll end up with a bunch of tools.


Another claimed advantage of IaC is better auditability of your setup. This is absolutely correct... from the point of view of your infrastructure team. But have you ever asked your developers if they understand the infrastructure when they only have the IaC code available? Have you ever asked your security department if IaC helps them see whether all of your security requirements are met?

I can say from my perspective that at least your security department will clearly answer with a "No!". Most of the IaC tools on the market don't track manual changes. You load a kernel module that is not defined in your IaC? IaC will ignore it, because IaC doesn't track it. You change a configuration that is not defined in your IaC, because your sysadmins used the default settings? Your IaC tools will not see it. Why? Because these tools only track what you tell them to track. If you don't import all of your configurations and expected system states into your IaC environment, IaC cannot help you with auditability. In the end your security department will use a semi-intelligent intrusion detection system to audit the systems. Congratulations! It's not IaC that helped you audit your systems, but your IDS. Oh wait... you could install this IDS without IaC too, and the setup would only take half the time.

When IaC really helps

Of course there are reasons to use IaC. But you should inspect your environments and your requirements before you decide to use it. If you answer most of the following questions with "No", you shouldn't use IaC:

Do you often change your tech stack?

Is your environment highly scalable?

Do you regularly start completely new environments?

Do you use auto-scaling environments or plan to use them? (If yes, also ask if you really need auto-scaling and how much it can save.)

In fact, in 20 years on the job I've seen only a few environments that really needed IaC. One was at an agency that had to set up new environments for new customers every few days. Another one was a server network consisting of several hundred servers.

If you had a more or less static environment in the past, where you only add some additional servers every few weeks, you don't need IaC. Your sysadmins will be much faster at setting up new servers if they don't have to use IaC for it. If you work in an environment based on Docker/Kubernetes/Cloud, you can scale your environment with some simple changes to manifest files, and you can add additional nodes to your cluster with basically a single command on the new node. If you use auto-scaling groups on AWS but your tech stack is not constantly changing, you don't need IaC.
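To illustrate the manifest-file point: in Kubernetes, scaling a workload out is usually just editing the replicas field and re-applying the file. A minimal, hypothetical Deployment sketch (names and image are placeholders):

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 5          # raised from 3; "kubectl apply -f web.yaml" scales out
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.25
```

No IaC tool chain is needed for this kind of change; the orchestrator already reconciles the declared state for you.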

IaC also (partially) helps if you must prepare your network to move to another hosting provider. If your hosting provider's datacenter is destroyed and you must move to another provider within a few hours, IaC is a big advantage. It mostly abstracts the API layers of the providers and thereby makes it possible to set up your infrastructure from scratch very fast... as long as your IaC code is prepared for it and you stored backups outside of your current provider. In fact most IaC infrastructures are not prepared for such use cases and are therefore mostly useless in such situations.

You need IaC if your tech stack is very flexible, for example if your developers play around with new technologies every few days or weeks. You need IaC if you add additional servers every day or week. You need IaC if you have a very big server network, i.e. >100 servers. You need IaC if your infrastructure team consists of more than 5-10 employees (presuming that you'll also implement guidelines on how to use IaC). In all other cases you don't need IaC. IaC is just another hype; in the end it's only needed for very flexible or very big environments. And this doesn't apply to most small- and mid-size IT companies.

What you should consider

If you decide to use IaC you should consider some points:

  1. Set very strict guidelines on how to use IaC. Especially the reusability of modules is a big pain point in most companies. If a module is not reusable, it's not a module! And it must be tested whether a module is really reusable!
  2. Provide additional documentation. Even if your sysadmins think that the IaC code is enough documentation, ask for data flow diagrams, documentation of the repository content etc. It will save a lot of time for your developers!
  3. Track and calculate the time your sysadmins need for IaC. If they need longer than with a manual setup of the servers, stop it immediately!
  4. Ask other departments if they still understand your environment. IaC is an additional abstraction layer that may be confusing for other employees as long as they don't have additional documentation (see 2.).
  5. Ask your external partners if they can work with the tools you use. Otherwise your sysadmins will spend a lot of time integrating the setups of your external partners into your IaC.
  6. Do a cost calculation. If the time saved by IaC doesn't significantly exceed the time your sysadmins normally need to set up new servers, it's not worth using.
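Point 6 can be sketched as a back-of-the-envelope calculation. All numbers below are made-up placeholders; plug in the figures from your own time tracking (point 3):

```python
# Rough monthly break-even check for IaC (all inputs in minutes).
# The function and every number here are illustrative assumptions,
# not measurements from any real environment.

def iac_break_even(manual_minutes_per_change: float,
                   iac_minutes_per_change: float,
                   changes_per_month: float,
                   iac_maintenance_minutes_per_month: float) -> float:
    """Return net minutes saved per month (negative = IaC costs time)."""
    saved = (manual_minutes_per_change - iac_minutes_per_change) * changes_per_month
    return saved - iac_maintenance_minutes_per_month

# A small, static shop: 4 changes a month, each change is slower with
# IaC, and the tool chain still needs maintenance -> IaC loses time.
small_shop = iac_break_even(manual_minutes_per_change=30,
                            iac_minutes_per_change=60,
                            changes_per_month=4,
                            iac_maintenance_minutes_per_month=120)

# A fast-moving shop: 60 changes a month, each one faster with IaC.
busy_shop = iac_break_even(manual_minutes_per_change=30,
                           iac_minutes_per_change=5,
                           changes_per_month=60,
                           iac_maintenance_minutes_per_month=240)

print(small_shop)  # negative: IaC costs this team time
print(busy_shop)   # positive: IaC pays off here
```

The point of the sketch is only that the result flips sign with the rate of change: a static environment lands firmly in the negative.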


If you really decide to use IaC, you should do some simple calculations and think about some points.

How much time will it take to integrate my current infrastructure into IaC?

Which tools does my infrastructure team have to maintain in addition to my tech stack?

What are the costs for the time your sysadmins need to integrate your infrastructure into IaC, and how much time do they really save with this step?

Is my server environment really flexible and/or scalable enough to justify the high setup costs?

Who will manage the current environment until the switch to the new environment can be done?

And don't forget to evaluate the tools before you use them. The biggest chaos arises when you use tools that don't fit your requirements. "This tool is cool" from the mouth of your sysadmins is not an evaluation!

The 4 Levels of IT Security

In principle, the protection of IT systems can be separated into 4 levels: prevention, detection, assessment and response. In the proper combination, they can secure the IT platform of any company as well as it can be secured.


The area of prevention is probably the most comprehensive area for IT system security. It can be fundamentally separated into data protection and system protection. Data protection includes things like data encryption, transport encryption, backups and even access protection to IT systems. System protection, on the other hand, involves things like hardening systems or patch management. Unfortunately, too many companies still focus exclusively on the area of prevention when it comes to securing their IT systems. This then makes forensic work more difficult if a system does get breached. Because...


It's no secret among hackers that there is no such thing as 100% security for IT systems. A simple bug can already provide an attack vector. And in most cases it takes at least a few hours until a suitable patch is available. In addition, there are exploits that are only passed around by hackers under the table, so that it sometimes takes days or even weeks for the gap to become known. Remember the Exchange bug some months ago? It was known to intelligence agencies a long time before. So it's necessary that anomalies in an IT system are detected.

This is where detection comes into play. It includes firewall systems that validate traffic, as well as intrusion detection and prevention systems that detect anomalies in the systems themselves. Furthermore, modern intrusion detection systems are also capable of analyzing statistical data from the systems and using this to detect unusual processes. Good defense systems can also block typical attacks (like brute-force attempts, for example) and thus prevent worse.


However, even the best intrusion detection systems and firewalls can falsely identify processes as attacks. They are already quite good at automatic assessment and are getting better and better thanks to artificial intelligence. Nevertheless, a review by a human should also take place. The evaluation should therefore never be left to the automated systems alone. Automated assessment should only ever be one part of the assessment process, using alerting to draw attention to the fact that an unusual process has been detected in the system. A human review is always required.


Depending on how the assessment then turns out, a reaction is, of course, necessary. If an attack is detected that is still in progress, appropriate defensive measures should be taken. If a system has already been successfully compromised, a forensic investigation must be carried out to find out how the attack took place and what data was manipulated or stolen. Furthermore, a report must be made to the Data Protection Official and, if necessary, to the responsible authorities. It is usually not a bad idea to inform the customers, because experience shows us that data breaches are always published somehow. It's better if the company keeps control over the publishing.

If you implement this 4-level-model in all IT systems and your risk management and have appropriately qualified employees, you can at least assume that attacks will not go unnoticed and that in many cases they can be averted in good time. In today's world, this is vital for companies. After all, in addition to high fines and possible lawsuits, the loss of image is often difficult or impossible to repair.

Why we don't use Getstream - or why more privacy means more problems

While developing our mobile app, we also searched for a service provider for text chat for our users. The solution from Getstream.io looked very nice, and a test implementation showed that it worked perfectly for our needs. As usual, I searched for a Data Processing Agreement (sometimes also called a Data Processing Addendum, especially if it's based on the so-called Standard Clauses), because Art. 28 GDPR requires it if personal / sensitive data is processed by a third-party provider (also called the processor) on behalf of the data controller.

I found good documentation of their security on their website. But I couldn't find a pre-signed data processing agreement or similar, as is provided by most other companies and even by Google and Microsoft. So I opened a ticket and asked for a contract, because as a Germany-based company we have some special requirements for such agreements.

The GDPR defines in Art. 28 para. 9 that the contract "shall be in writing, including in electronic form". Ok, it's on their homepage and it's in electronic form. But § 126a of the German Civil Code defines the requirements for the electronic form, because this is not defined by the EU but is the responsibility of the individual EU members. And there it is clearly stated: "If the legally required written form is to be replaced by electronic form, the issuer of the declaration must add his name to it and provide the electronic document with a qualified electronic signature." A text on a website doesn't comply with this requirement.

The answer to a request to the Governmental Data Privacy Official of Bavaria also said that at least a fixed format and evidence that both sides (the processor and the controller) have agreed to it are required. So what companies in Germany basically need is at least a write-protected PDF file and an email in which the third-party provider states that this is their Data Processing Agreement.

Unfortunately, Getstream's support replied to my request that they don't provide a DPA unless we are customers of their Enterprise Plan. Okaaaay... we, a small startup from Germany, must have >100k users before they give us a DPA? Really? I think it's clear that we will not continue with their service. Transferring our users' data to a third party without a legal basis is not a risk that we're willing to take, neither I nor our CEO. Is it really that hard to create a PDF from their website and send it to us via email?

But Getstream was the only third-party provider in my whole career that tied a DPA to a specific plan, user level or similar. Even small startups from the USA send us a DPA if we explain the legal requirements to them and point out the laws we have to comply with. Most of them even sign it. On the other hand, it also shows what problems the strong privacy laws bring to companies in Europe.

My privacy toolkit

As someone who always has an eye on protecting private data, not only for our company's customers, partners and employees but also for myself, I've looked at quite a few tools over the years, and some of them have become part of my daily workflows. But before I talk about some of them, I want to make a few points clear.

First of all, I'm an Apple user. And I am one by conviction. I'm not an Apple fanboy who always needs the newest iPhone, iPad, Watch and Mac and sleeps in front of an Apple Store to be the first one who can buy the newest models. But I have worked with macOS (and formerly OS X) for around 10 years now without ever regretting the switch from Linux. Of course my smartphone is an iPhone and my tablet is an iPad Pro. To be honest, I never had so few problems on a Linux machine or an Android device as I have on my Apple devices. Of course, there was never a problem on Linux that I couldn't solve by myself, but if I look back to those times when I used Linux as a desktop system, I see a lot of lost hours that I spent fixing errors and unwanted behaviors. Since I've been using Macs, I have never faced similar problems again. And therefore I'm a convinced Apple user. In general macOS is a BSD-like system, and I like how smoothly the different Apple devices work together. For this reason, the software I'll talk about here will be mostly for Apple devices. But some of it is also available on other systems and platforms.

Secondly, I would like to note that my software recommendations are purely subjective. I don't claim that this is actually the best software available for specific tasks. There may be better ones, but the ones I'm going to write about here I just particularly like. Therefore comments like "But XYZ is much better than ABC, because ..." are completely useless. If you want to recommend some software to me, give me facts to compare, not opinions.

And last but not least, I'm fully aware that I'm using paid software where I could use OSS alternatives. But there are reasons why I prefer commercial software to open source software. In my experience, software that you have to pay for is mostly better than the available OSS alternatives, at least on macOS.

But now... let's begin...

Email Encryption

I couldn't live without email encryption. Unfortunately this is a task that most Linux machines can do better with free / open-source software than a Mac can. But there is a solution called GPG Suite. It's not free, but worth every cent. It adds GPG / PGP encryption to the Apple Mail application, and as soon as you have created (or imported) a key for your email address(es), you can use encryption and email signing with a simple click. Your keys and the public keys of your contacts can easily be managed in a keychain-like interface, the GPG Keychain. Importing keys for specific recipients is also easy with the keyserver search that is integrated into the GPG Keychain tool. In addition, I'm using an email provider that offers GPG / PGP even in their webmailer. If you ever thought that email encryption is complicated, try Apple Mail with GPG Suite. You only need to understand: the public key is used to encrypt a message, the private key is used to decrypt it. That's basically all you need to know. Of course, this also means that you never give your private key to another person, because only you should be able to decrypt a message that somebody encrypted with your public key.

Taking Notes

Before I found Standard Notes, I used Evernote. And whenever I wanted to secure some data / notes, for example serials, I encrypted them with the GPG CLI tools before I added them to a note. I simply couldn't trust the built-in function for encrypting notes, and I wanted to make sure that Evernote wasn't able to read my private stuff. I'm all the happier to have found Standard Notes. With this tool my life became easier, even if I still miss some features, and some of them may never be implemented. But in general I like the idea of open source, and for me it's also ok if an open-source project provides paid features. Good work should earn a good income, in my opinion. And therefore I pay to be able to use the available extensions. In addition, I also sponsor the project on GitHub. It's only $5 per month, but if more people did that, such projects could develop faster and their developers would have an easier life. I also support other projects and some artists in a similar way.

2-Factor / Multi-Factor Authentication

What I expect from my colleagues at work I also apply in my private life, at least when it's related to information security. This also means that I use 2FA/MFA wherever it is possible. But I'm not a fan of purely software-based solutions like the Google Authenticator app, Authy or the built-in 2FA of 1Password and similar tools. I'm using a YubiKey from Yubico. Yes, I know the controversial discussions around Yubico. But in general I don't think that you can trust any piece of computer hardware on our planet. Especially U.S. intelligence agencies have manipulated hardware too often in the past (as was later leaked by whistleblowers), and therefore we should always be skeptical when we buy new hardware, no matter what type of hardware it is.

In the end my YubiKeys make my life easier, and they at least prevent hackers from logging into my accounts even if they get my username and password. What I especially like is the availability of an authenticator app that reads and stores your account information on the YubiKey. That way I can use the same authenticator OTPs, no matter which device I'm currently working on. As soon as I insert my key or use NFC to connect it to the app on my computer/smartphone/tablet, I see the same accounts in the authenticator app. This app is especially helpful because not all platforms with 2FA also support hardware tokens. Often they provide only an authenticator app interface. In addition, I can use my YubiKeys to unlock my computers without entering my password. It needs a little tinkering to make it work on macOS, but the looks I get when I unlock my laptop with a key like other people unlock their cars are priceless.

Collecting Informations

Anyone who has ever had to do more complex research work knows how quickly you end up with large amounts of unstructured data that you can easily lose track of. And sometimes you have to do research work that shouldn't be shared with other people, like the employees of your service provider. For that reason I'm using a tool called Yojimbo from Bare Bones (who also make the popular BBEdit). It's not the newest piece of software on the market, but I like the idea that you have a kind of drawer on the side of your desktop where you can simply drop data that you want to preserve. Such data can be a document, a URL, a serial number, a piece of text and so on. Later you can tag the data to make it easier to search, but the built-in search is also very good at finding information based on its content. Tags also allow you to create topic-specific lists of your collected data that can be easily accessed from the left side of the Yojimbo window or from the drawer. If you have any sensitive information, you can encrypt (and decrypt) it with a simple click. Yojimbo is not using any central server to sync data between devices. The data always stays on your computer or in your iCloud. And by that you keep control over the data you organize in this tool.


If you do research work on the web, you'll often find information in languages you cannot speak or understand. Online translators like Google Translate are helpful to extract at least the essence of a text. A little insider tip is a newer translation tool from Germany, DeepL.com. Currently, new languages are added from time to time. But what is particularly striking is that the translations provided by DeepL are far better than the translations provided by other providers. What does this have to do with privacy? If you have a Pro account for DeepL, they don't store your inputs unless you allow it in your account settings. And because it's a Germany-based company, it's not so easy for them to store PII that users may input. Our privacy laws are very restrictive, and the penalties companies must pay if they process data for purposes to which the user hasn't consented can be very painful, even for bigger companies. So if you need to translate a private text that nobody should have except yourself, use a Pro account on DeepL. The translations may not be perfect (for example, the German word "Datenträger" is wrongly translated as "data carrier" instead of "storage device" or "medium / media"), but they are much better than the results from Google, for example. And Google will always store your input for further purposes that you'll never know.


A lot of digital communication is done via instant messengers today. Unfortunately this means that the providers of such messengers are mostly able to read your conversations. There are only a few exceptions like Signal, but even they don't have the best privacy, because they can connect data about you with your user account. A newer solution comes from Switzerland. It's called TeleGuard. Even if the functionality is very basic so far, the company behind this messenger (Swisscows) follows a very strict privacy policy. They don't store any information about your chats on their servers, unless you allow them to store backups of your conversations, which you can use to restore them on new devices. But even those backups are encrypted and decrypted only on your devices. The corresponding key never leaves you. However, this also means that you can lose your data completely if you forget the password you used to encrypt the data or to access your account. In this case, even Swisscows will not be able to recover your data.

Btw, Swisscows also provides a search engine that likewise focuses on privacy (in addition to child protection, which is why you cannot find things like porn with it). If you need a child-friendly search engine with a privacy focus, give it a try.

Data Storage

Data storage is, in my opinion, one of the most critical pieces of infrastructure that we use in our daily lives today. Of course we can rely on solutions like Dropbox, Google Drive or iCloud, but in the end we don't know what the providers really do with the data we store on their platforms. An easy solution is to use your own servers. You can either host them at home (an old laptop with an additional external hard disk does fine for most private use cases) or you can rent servers in a data center that are fully controlled by you. But be careful: never try to operate a server if you don't know anything about system administration. If you rent a server, you're also responsible for what is done with that machine. And if a bot is injected into your server and you don't notice it, it can become very expensive if the bot causes any harm to other IT infrastructure. In such a case, you should rather ask somebody with the necessary knowledge to set up and manage the server for you.

Another alternative comes from my ISP, the German Telekom. They provide a cloud storage called "MagentaCloud" at moderate pricing (500 GB for 5 € / month, for example). And because it's a Germany-based company with its data centers only in Germany, they can provide very good privacy: even German intelligence services cannot access the data without the consent of a German court. The options for accessing the storage - rsync, scp, SFTP, WebDAV, their apps and the web interface - are also good and sufficient for my purposes. An additional security layer can be added by using a tool like Boxcryptor to encrypt all files before they are uploaded.
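Because MagentaCloud speaks standard protocols, scripted access needs no special SDK. The sketch below builds a WebDAV file upload (which is just an HTTP PUT with Basic authentication) using only Python's standard library; the URL is a placeholder, not Telekom's real WebDAV endpoint:

```python
import base64
import urllib.request

def build_webdav_put(url: str, payload: bytes, user: str, password: str) -> urllib.request.Request:
    # A WebDAV upload is a plain HTTP PUT; authentication here is HTTP Basic.
    req = urllib.request.Request(url, data=payload, method="PUT")
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

# Hypothetical endpoint - replace it with your provider's real WebDAV URL.
req = build_webdav_put("https://webdav.example.com/backup/notes.txt", b"my notes", "me", "secret")
# urllib.request.urlopen(req) would perform the actual upload.
```

For everyday use, a dedicated client (or rsync/scp as mentioned above) is more convenient; this only shows how simple the protocol itself is.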

And all the other stuff

Sometimes I reach points in my work where available software doesn't meet my requirements at all. In such cases I often write small scripts or tools in Python, Go or Perl that do exactly what I need. If you have the time, I can only recommend learning a programming language. It enables you to write your own software when you have specific requirements or special tasks. For example, I built a small tool in the past that helps me analyze the compliance of cloud environments used by our company. I simply add the required access keys, and the software runs in the background until it creates a final report for me that I can use in different security management tools. But often it's just one-liners. For example, you can Base64-encode the content of a file with a simple: perl -MMIME::Base64=encode_base64 -e 'print encode_base64 join "", <>' < myfile.txt
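The same one-liner philosophy works in Python, too. Here is a small equivalent of the Perl snippet above, using only the standard library (base64.encodebytes wraps its output at 76 characters, just like MIME::Base64 does):

```python
import base64

def b64_encode_file(path: str) -> str:
    # Read the whole file as bytes and return its Base64 encoding,
    # line-wrapped like MIME::Base64's encode_base64.
    with open(path, "rb") as f:
        return base64.encodebytes(f.read()).decode()
```

Or directly on the shell: python3 -c 'import base64,sys; sys.stdout.write(base64.encodebytes(sys.stdin.buffer.read()).decode())' < myfile.txt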

And before you ask... my preferred editors for programming are Sublime Text and Emacs with Spacemacs.

So that's basically my privacy toolkit for my private life. At work I have, of course, many more privacy-related tools, like software for risk management, for creating a data processing activities index, for managing security incidents, and much more. Maybe I'll write something about my "CISO toolkit" sometime in the future. In the meantime... live long and prosper!

100 days Challenge

Writing every day is nearly impossible for me, at least at the moment. Too much work has to be done before the launch of our new app. On some days I work around 16-18 hours. Around 7 days are left until everything has to be compliant with European and German laws. Nevertheless, I'll still try to write some short lines every day.

In around 4 hours I'll have my next meeting. So I should call it a day and try to get some sleep. Of course this is not a permanent state but rather an exception. After the launch my life will become a little more relaxed again.

Preparing the launch of a new app from the (data) security perspective

Our company is currently preparing the launch of a new mobile app. This means stress throughout the whole company. Marketing has to prepare all the campaigns and their tracking, the associated website has to be designed and tested, the management is constantly in contact with various agencies and the funders, and you can surely imagine that the developers hardly have a quiet minute.

I'm in the exciting position of being involved everywhere. After all, there is hardly any department in which data protection is not somehow involved. Not only does the data of future users have to be stored and processed in accordance with the legal regulations on data protection - for which I check the server setups and make sure that things like encryption are implemented properly - but things like the terms of service, the privacy statements on the website and in the app, contracts and NDAs with external service providers, or how our customer support should handle requests from users in the future also land on my desk. Even the marketing department has to reckon with me rapping them on the knuckles if they try to link personal data with their tracking data, which I also monitor. And yes, even the cleaning staff in our office are not allowed to access all areas, and I'm responsible for ensuring they can't: by defining rules for how our employees have to handle access requests from other departments or from strangers, by ensuring that appropriate locking systems are installed in the doors, and by making sure the HR department handles the key distribution correctly.

In short, I have to have my eyes and ears pretty much everywhere. Besides that, I'm creating the data processing activities index, documenting data flows, and creating the data protection impact assessment for the app. And my usual activities, like checking and updating our internal policies and concepts, the risk management for our company (not only for data security but also for business continuity and so on), auditing the IT systems we use, etc., also need to be done.

But that's exactly what I enjoy so much about my position. Whereas in my previous positions I used to focus almost exclusively on the IT department (apart from the startups I was involved with in the early stages, where you always have to help out in other areas anyway), in my role as CISO I gain insight into all areas of our company, starting with our office management and ending with the top management, of which I'm now also a part. I find it exciting to get this overall view of how a company like this works and how all the employees work together like the parts of a well-oiled clockwork. At the end of the day, you look at the day's work in amazement and see how much has been accomplished in so few hours. If you ever want to run your own company, I can recommend working in the security department of an IT company for a while. Afterwards, you will have much more understanding of the worries and fears of the employees, from the cleaning staff up to the department heads and C-level managers.

FF-Sec says hello

Hello world!

With the output of this line, many people start programming nowadays. Looks like a good start for a blog to me, too. ;)

When I dove into the world of computers over 20 years ago, I could not have guessed the journey I would begin. It was not my first contact with a computer (a KC-85 from the GDR in our school was the first computer I used), but the real journey began when I first installed Linux on my 386 PC, because Windows (3.11) and gaming bored me. On it I learned my first programming language... C. Ok, not really my first one, because I had already written an application in a BASIC dialect on the KC-85, but that was mostly copying it from a piece of paper, and my understanding of it was, like the language was named... very basic. ;)

Of course my first application in C was also a kind of "hello world", but I already added a condition dependent on a variable. And for me it was breathtaking, because I experienced for the first time how it feels when my machine does what I want. Programming became the drug of my choice. Shortly after that I learned assembler, because I wanted to understand all parts of the Linux kernel. Along the way I learned Perl to automate things on my computer. And just 2 years later - 2 years in which I spent about 12-18 hours a day in front of my computer - I started to work as a freelance system administrator and set up Linux servers for different companies. Then at some point a company came along that really wanted to take me on as a permanent employee and made me a good offer. From that point on, my career went from system administrator to system engineer to DevOps engineer to security engineer to IT security manager, because data protection and IT security were always important topics for me. And now I'm the CISO of a company in Germany and take care of all the information security topics in our company.

In fact, I never did any formal training in computer science (instead, I trained as a laboratory chemist). I taught myself everything, partly from books, but mostly by trying things out, reading the man pages built into Linux, and from information I found on the WWW.

And here I am, in my mid-40s, still addicted to computers. Although I still deal with Linux servers, I now prefer macOS as my desktop system. After all, it's just another Unix, and I often had to deal with Unix during my career; FreeBSD and Solaris were my daily companions at times. About a year ago, my father gave me a Windows PC because he couldn't do anything with Windows 10 and wanted his XP back. He is nearly 80 years old now and never had internet access, so that's ok. And I can only say... WTF?! How did such a static, inflexible and untidy system ever succeed in the market? As with Windows 3.1(1), it is somehow only useful for gaming. And that's exactly why the PC is now sitting around in my apartment. You can certainly imagine how often it is turned on. It must be about 4-5 months since I last used it. :D

What can you expect from my blog? I have absolutely no idea. :D About 10 years ago, when I worked for the blog.de platform (at that time still quite a big blogging platform in Germany, later sold to an Italian company that fucked it up), I blogged regularly. After that I wrote irregularly on different platforms. Even today I still have 2 blogs where I post the pictures I draw on the iPad and articles about topics that interest me. But I like the idea of writing a blog with a note-taking app. Somehow it feels like this is how blogging should work: no distraction from creating designs and similar stuff, simply writing what comes to my mind. And that's what you can expect here... whatever comes to my mind. It may be about topics related to data protection and information security (yes, for me there is a difference, perhaps I'll explain it someday), thoughts about drawing with Procreate on the iPad, my adventures while geocaching (yes, sometimes even I leave my home to face the real world), what I learn as a dad from my (autistic) children - and there is a lot we can learn from children - or any other topics. We'll see.

Let's see where this journey leads me. If my English isn't perfect, please be indulgent. Even after 20 years I'm not really good at it. But I try my best and I hope my posts are understandable. So... let's have some fun! See you soon!