Certificate Authority setup: Doing it right with OpenSSL
Friday, December 27. 2013
In my previous post about securing HTTP connections, HTTP Secure: Is Internet really broken?, I speculated about the current state of encryption security in web applications. This article is about how to actually implement a CA in detail and about the requirements for doing so.
During my years as a server administrator, I've been setting up CAs in Windows domains. There, many of the really complex issues I describe below have been handled for you, so that you don't even notice them. I'm sure that using commercial CA software on Linux would yield similar results, but I just haven't really tested any. This article is about implementing a properly done CA with OpenSSL's sample minimal CA application. Out of the box it does many things minimally (as promised), but it is the most common CA: it ships with every Linux distro and the initial investment price (0 €) is about right.
Certificate what?
In cryptography, a certificate authority or certification authority (CA), is an entity that issues digital certificates.
— Wikipedia
Why would one want to have a CA? What is it good for?
For the purpose of cryptographic communications between web servers and web browsers, we use X.509 certificates. Generally we talk about SSL certificates (based on the original, now obsoleted standard) or TLS certificates (based on the currently valid standard). The bottom line is: when we do HTTP Secure, or HTTPS, we need a certificate issued by a Certificate Authority.
Historically, HTTPS was developed by Netscape back in 1994. They took X.509 as a starting point and adapted it to work with HTTP, a protocol developed by the well-known Tim Berners-Lee at CERN a couple of years earlier. RFC 5280 describes X.509. It has the funny title "Internet X.509 Public Key Infrastructure Certificate and Certificate Revocation List (CRL) Profile", but it is the current version.
Wikipedia says: "In cryptography, X.509 is an ITU-T standard for a public key infrastructure (PKI) and Privilege Management Infrastructure (PMI). X.509 specifies, amongst other things, standard formats for public key certificates, certificate revocation lists, attribute certificates, and a certification path validation algorithm."
So essentially X.509 just dictates what certificates are and how they can be issued, revoked and verified. Netscape's contribution was simply very important at the time, as they added encryption of the HTTP data being transferred. The contribution was so important that the invention is still used today, and nobody has proposed anything serious to replace it.
So, enough history... let's move on.
Encryption: The ciphers
It is also important to understand that a certificate can be used to encrypt the transmission using various cipher suites. Run openssl ciphers -tls1 -v to get a cryptic list of all supported ones. Examples:
RC4-MD5 SSLv3 Kx=RSA Au=RSA Enc=RC4(128) Mac=MD5
DES-CBC3-SHA SSLv3 Kx=RSA Au=RSA Enc=3DES(168) Mac=SHA1
AES128-SHA SSLv3 Kx=RSA Au=RSA Enc=AES(128) Mac=SHA1
AES256-SHA256 TLSv1.2 Kx=RSA Au=RSA Enc=AES(256) Mac=SHA256
The actual list is very long, and not all of the suites are considered secure; see the chart in Wikipedia about some of the secure and insecure ciphers. (A filtering example follows the column list below.)
The columns are as follows:
- Name (or identifier)
- The protocol version which added the support
- Kx=: Key exchange method, RSA, DSA, Diffie-Hellman, etc.
- Au=: Authentication method, RSA, DSA, Diffie-Hellman or none
- Enc=: Encryption method with number of bits used
- Mac=: Message digest algorithm, for example MD5, SHA-1 or SHA-256
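To narrow the output down to suites that are generally considered reasonably strong, you can pass a cipher string as a filter. A minimal sketch (the exact output depends on your OpenSSL build):

# List only "high" strength suites, excluding anonymous and MD5-based ones
openssl ciphers -v 'HIGH:!aNULL:!MD5'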
Leaving aside any possible implementation bugs, a number of flaws have been found in TLS/SSL itself, especially when weak cipher suites are used. One recent and well-known example is the BEAST attack, which can easily be mitigated by limiting the available cipher suites. If BEAST is successfully applied, it can decrypt cookies from request headers. Thus, it is possible for somebody eavesdropping to gain access to your account even though you're using HTTPS. So essentially you've been doing exactly what every instruction about safe browsing tells you to do, you've been using HTTPS, and still your traffic can be read by observing parties.
Please note that when talking about cipher suites, there are two different entities with a number of bits: the server key and the encryption key. The server key is the server's public/private key pair; it can have, for example, 2048 bits. The server key is used only to transmit the client-generated "real" encryption key to the server; that key can have, for example, 256 bits. The server key is typically never renewed; the encryption key, on the other hand, can have a lifetime of a couple of seconds.
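For illustration, generating a 2048-bit server key and a certificate signing request (CSR) for it with OpenSSL looks something like this (the file names and subject are only examples):

# Generate a 2048-bit RSA private key for the server
openssl genrsa -out server.key.pem 2048
# Create a CSR for that key; this is what gets sent to the CA for signing
openssl req -new -key server.key.pem -out server.req.pem \
    -subj "/C=FI/CN=www.example.com"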
The easy way: Self-signed certificates
Ok. Now we have established that there is a need for certificates to encrypt the transmission and to give a (false) sense of verified identity about the other party. Why not go the path of least resistance? Why bother with the complex process of building a CA, when it is very easy to issue self-signed certificates? A self-signed certificate provides the required encryption; your browser may complain about an invalid certificate, but even that complaint can be suppressed.
Most instructions tell you to run some magical openssl commands to get your self-signed certificate, but in reality there even exists a machine for that: the Self-Signed Certificate Generator. It doesn't get any easier than that.
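For reference, the "magical" command is typically a single openssl req invocation along these lines (names and validity are only examples):

# Create a private key and a self-signed certificate in one go,
# valid for one year, with no passphrase on the key
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -keyout selfsigned.key.pem -out selfsigned.cert.pem \
    -subj "/CN=www.example.com"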
My personal recommendation is to never use self-signed certificates for anything.
An excellent discussion about when they could be used can be found in Is it a bad practice to use self-signed SSL certificates?, What's the risk of using self-signed SSL? and When are self-signed certificates acceptable?
In practice, a self-signed certificate slips into use like this:
- A lazy developer creates the self-signed certificate. He does it for himself and chooses to ignore any complaints from his browser. Typical developers don't care about the browser complaints anyway, nor do they know how to suppress them.
- In the next phase the error occurs: the software goes to internal testing and a number of people start using the bad certificate. Then a couple of the customer's people or external testers are added.
- Now a bunch of people have been trained not to pay any attention to certificate errors. They carry that trait for decades.
I've not seen any self-signed certificates in production, so typically a commercial certificate is eventually purchased. What puzzles me is why they don't set up their own CA, or get a free certificate from CAcert or StartSSL.
CA setup
How to actually set up your own CA with OpenSSL has been discussed extensively for ages. One of the best sets of instructions I've seen is by Mr. Jamie Nguyen. His multi-part instructions are: How to act as your own certificate authority (CA), How to create an intermediate certificate authority (CA) and How to generate a certificate revocation list (CRL) and revoke certificates.
I'd also like to credit the following, as they helped me with setting up my own CA:
- Howto: Make Your Own Cert And Revocation List With OpenSSL
- OpenSSL CA and non CA certificate
- Generate a root CA cert for signing, and then a subject cert
There are other, non-minimal CA packages which may be interesting, but I didn't review them.
The good way: The requirements for a proper CA
My requirements for a proper CA are:
- 2-level CA (This is what Jamie instructed):
- Root CA: For security reasons, make sure this is not in the same machine as intermediate CA. Typically this needs to be protected well and is considered off-line, except for the weekly revocation list updates.
- Intermediate: The active CA which actually issues all certificates
- Passes Windows certutil.exe -verify verification
- All certificates are secure and have enough bits in them
- A certificate requested for www.domain.com implicitly also has domain.com in it
- A certificate identifies the authority that issued it
- A certificate has location of revocation information in it
- CA certificates (both root and intermediate) identify themselves as CA certificates with proper usage in them
Verifying an issued certificate
In the requirements list there is only one really difficult item: getting Windows certutil.exe to verify an issued certificate. The reason is that pretty much all of the other requirements must be met in order to achieve it.
A successful run is very long, but will look something like this (this is already a reduced version):
PS D:\> C:\Windows\System32\certutil.exe -verify -urlfetch .\test.certificate.crt
Issuer:
C=FI
Subject:
CN=test.certificate
C=FI
Cert Serial Number: 1003
-------- CERT_CHAIN_CONTEXT --------
ChainContext.dwInfoStatus = CERT_TRUST_HAS_PREFERRED_ISSUER (0x100)
ChainContext.dwRevocationFreshnessTime: 24 Days, 20 Hours, 35 Minutes, 29 Seconds
SimpleChain.dwInfoStatus = CERT_TRUST_HAS_PREFERRED_ISSUER (0x100)
SimpleChain.dwRevocationFreshnessTime: 24 Days, 20 Hours, 35 Minutes, 29 Seconds
CertContext[0][0]: dwInfoStatus=102 dwErrorStatus=0
Element.dwInfoStatus = CERT_TRUST_HAS_KEY_MATCH_ISSUER (0x2)
Element.dwInfoStatus = CERT_TRUST_HAS_PREFERRED_ISSUER (0x100)
---------------- Certificate AIA ----------------
Verified "Certificate (0)" Time: 0
---------------- Certificate CDP ----------------
Verified "Base CRL (1005)" Time: 0
CertContext[0][1]: dwInfoStatus=102 dwErrorStatus=0
Element.dwInfoStatus = CERT_TRUST_HAS_KEY_MATCH_ISSUER (0x2)
Element.dwInfoStatus = CERT_TRUST_HAS_PREFERRED_ISSUER (0x100)
---------------- Certificate AIA ----------------
Verified "Certificate (0)" Time: 0
---------------- Certificate CDP ----------------
Verified "Base CRL (1008)" Time: 0
---------------- Base CRL CDP ----------------
No URLs "None" Time: 0
---------------- Certificate OCSP ----------------
No URLs "None" Time: 0
--------------------------------
CRL 1006:
CertContext[0][2]: dwInfoStatus=10a dwErrorStatus=0
Element.dwInfoStatus = CERT_TRUST_HAS_KEY_MATCH_ISSUER (0x2)
Element.dwInfoStatus = CERT_TRUST_IS_SELF_SIGNED (0x8)
Element.dwInfoStatus = CERT_TRUST_HAS_PREFERRED_ISSUER (0x100)
---------------- Certificate AIA ----------------
No URLs "None" Time: 0
---------------- Certificate CDP ----------------
Verified "Base CRL (1008)" Time: 0
---------------- Certificate OCSP ----------------
No URLs "None" Time: 0
--------------------------------
------------------------------------
Verified Issuance Policies: None
Verified Application Policies:
1.3.6.1.5.5.7.3.1 Server Authentication
Cert is an End Entity certificate
Leaf certificate revocation check passed
CertUtil: -verify command completed successfully.
The tool walks the entire certification chain from the leaf [0][0] through the intermediate CA [0][1] to the root CA [0][2], reads the information, does the URL fetching and confirms everything it sees, knows about and learns about. Your setup really needs to be up to par to pass that!
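On the Linux side, a rough equivalent sanity check of the chain (without the URL fetching that certutil does) is openssl verify; a sketch with assumed file names:

# Verify the issued certificate against the root, treating the
# intermediate certificate as an untrusted chain element
openssl verify -CAfile root.cert.pem \
    -untrusted intermediate.cert.pem \
    test.certificate.crt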
NOTE:
Your CA root certificate must be loaded into the Windows machine store. The verification does not work properly if you install the root certificate into the user store. Something like this, run with administrator permissions, will fix the verification:
C:\Windows\System32\certutil.exe -enterprise -addstore root Root-CA.cer
You'll just need to download the certificate file Root-CA.cer (in PEM format) as a prerequisite; see the sketch below.
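One way to fetch it on the Windows box is with the same certutil tool; a hedged sketch (the URL matches the examples later in this article):

C:\Windows\System32\certutil.exe -urlcache -split -f http://ca.myown.com/Root-CA.cer Root-CA.cer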
The setup
I'm not going to copy here what Jamie instructed; my work starts after you've done the basic setup.
The key to achieving all my requirements is in the v3 extensions. RFC 5280 lists the following standard extensions: Authority Key Identifier, Subject Key Identifier, Key Usage, Certificate Policies, Policy Mappings, Subject Alternative Name, Issuer Alternative Name, Subject Directory Attributes, Basic Constraints, Name Constraints, Policy Constraints, Extended Key Usage, CRL Distribution Points, Inhibit anyPolicy and Freshest CRL (a.k.a. Delta CRL Distribution Point). Also the private internet extensions Authority Information Access and Subject Information Access are worth noting. The good ones are the ones used in the configuration fragments below.
What I have is my own Bash script for approving a request on my intermediate CA. A fragment of it generates an openssl.cnf appendix:
# NOTE: Used for web certificate approval
[ v3_web_cert ]
# PKIX recommendation.
subjectKeyIdentifier=hash
authorityKeyIdentifier=keyid:always,issuer
# PKIX recommendation.
basicConstraints=CA:FALSE
# Key usage
# @link http://www.openssl.org/docs/apps/x509v3_config.html
keyUsage = digitalSignature, keyEncipherment, keyAgreement
extendedKeyUsage = serverAuth
authorityInfoAccess = caIssuers;URI:http://ca.myown.com/Intermediate-CA.cer
crlDistributionPoints = URI:http://ca.myown.com/Intermediate-CA.crl
subjectAltName = @alt_names
[alt_names]
DNS.1 = ${SUBJECT}
DNS.2 = ${ALT_NAME}
After generation, it is used in the script like this:
openssl ca -config ${CA_ROOT}/openssl.cnf \
-keyfile ${CA_ROOT}/private/intermediate.key.pem \
-cert ${CA_ROOT}/certs/intermediate.cert.pem \
-extensions v3_web_cert -extfile cert.extensions.temp.cnf \
-policy policy_anything -notext -md sha1 \
-in Input_file.req
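For completeness, the temporary extensions file referenced above (cert.extensions.temp.cnf) can be written with a simple here-document; a minimal sketch of one way to do it, assuming the subject and the alternative name come in as script arguments:

#!/bin/bash
# Subject and alternative name of the certificate being approved,
# e.g. www.domain.com and domain.com
SUBJECT="$1"
ALT_NAME="$2"

# Write the [ v3_web_cert ] appendix shown above into a temporary file;
# the variables are expanded when the file is generated
cat > cert.extensions.temp.cnf <<EOF
[ v3_web_cert ]
subjectKeyIdentifier=hash
authorityKeyIdentifier=keyid:always,issuer
basicConstraints=CA:FALSE
keyUsage = digitalSignature, keyEncipherment, keyAgreement
extendedKeyUsage = serverAuth
authorityInfoAccess = caIssuers;URI:http://ca.myown.com/Intermediate-CA.cer
crlDistributionPoints = URI:http://ca.myown.com/Intermediate-CA.crl
subjectAltName = @alt_names

[alt_names]
DNS.1 = ${SUBJECT}
DNS.2 = ${ALT_NAME}
EOF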
On my root CA I have this for the intermediate CA certificate:
# Used for intermediate CA certificate approval
[ v3_int_ca ]
# PKIX recommendation.
subjectKeyIdentifier=hash
authorityKeyIdentifier=keyid:always,issuer
# PKIX recommendation.
basicConstraints = critical,CA:true
# Key usage: this is typical for a CA certificate
# @link http://www.openssl.org/docs/apps/x509v3_config.html
keyUsage = critical, keyCertSign, cRLSign
authorityInfoAccess = caIssuers;URI:http://ca.myown.com/Root-CA.cer
crlDistributionPoints = URI:http://ca.myown.com/Root-CA.crl
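Signing the intermediate CSR on the root CA then uses the same kind of openssl ca invocation as shown earlier, just with the root key and the v3_int_ca extensions; something along these lines (the paths and the 10-year validity are only examples):

openssl ca -config ${ROOT_CA}/openssl.cnf \
-keyfile ${ROOT_CA}/private/root.key.pem \
-cert ${ROOT_CA}/certs/root.cert.pem \
-extensions v3_int_ca \
-policy policy_anything -notext -md sha1 -days 3650 \
-in intermediate.req.pem -out intermediate.cert.pem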
Also on the root CA I have this for the root certificate itself:
# Used for root certificate approval
[ v3_ca ]
# PKIX recommendation.
subjectKeyIdentifier=hash
authorityKeyIdentifier=keyid:always,issuer
# PKIX recommendation.
basicConstraints = critical,CA:true
# Key usage: this is typical for a CA root certificate
# @link http://www.openssl.org/docs/apps/x509v3_config.html
keyUsage = critical, digitalSignature, keyEncipherment, keyAgreement, keyCertSign, cRLSign
crlDistributionPoints = URI:http://ca.myown.com/Root-CA.crl
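The root certificate itself is self-signed, so it is created with openssl req rather than openssl ca; a sketch of the idea, with an assumed key size and 20-year validity:

# Generate the root key; keep the root CA machine off-line
openssl genrsa -out ${ROOT_CA}/private/root.key.pem 4096

# Self-sign the root certificate using the v3_ca section shown above
openssl req -new -x509 -days 7300 -sha256 \
-config ${ROOT_CA}/openssl.cnf -extensions v3_ca \
-key ${ROOT_CA}/private/root.key.pem \
-out ${ROOT_CA}/certs/root.cert.pem \
-subj "/C=FI/CN=My Own Root CA"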
Notice how the root CA certificate does not have an AIA (Authority Information Access) URL in it. There simply is no authority above the root level. The intermediate CA does have one, and that is reflected in the issued certificate. The CRL distribution points (Certificate Revocation Lists) are really important; that's the basic difference between a "minimum" CA and a properly done one.
Your next task is to actually make sure that the given URLs serve the indicated files. In case that wasn't clear: you will need a web server which your web browser can access to retrieve the files. The revocation list can be auto-generated daily or weekly, but it has a limited life span. Out of the box OpenSSL has default_crl_days = 30 in its .cnf file, so a month is the absolute maximum.
I'll compile the list of the 4 URLs here:
- Certificate locations:
- authorityInfoAccess = caIssuers;URI:http://ca.myown.com/Root-CA.cer
- authorityInfoAccess = caIssuers;URI:http://ca.myown.com/Intermediate-CA.cer
- Revocation list locations:
- crlDistributionPoints = URI:http://ca.myown.com/Root-CA.crl
- crlDistributionPoints = URI:http://ca.myown.com/Intermediate-CA.crl
For simplicity I keep the public files in a single location. My CRL generation script verifies the existing certificates and creates alerts if the old ones do not match the current ones. Perhaps somebody changed them, or I made a mistake myself.
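The CRL regeneration itself is just one openssl ca -gencrl call per CA level; a minimal sketch of the kind of thing a daily cron job could run (the paths and web root are assumptions):

# Regenerate the intermediate CA's revocation list and publish it
openssl ca -config ${CA_ROOT}/openssl.cnf \
-keyfile ${CA_ROOT}/private/intermediate.key.pem \
-cert ${CA_ROOT}/certs/intermediate.cert.pem \
-gencrl -out ${CA_ROOT}/crl/Intermediate-CA.crl

# Copy it to the web server serving http://ca.myown.com/
cp ${CA_ROOT}/crl/Intermediate-CA.crl /var/www/ca.myown.com/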
Summary
The requirements have been met:
- 2-level CA
- Pass!
- Passes Windows certutil.exe -verify
- Pass!
- All certificates are secure and have enough bits in them
- Pass!
- The server keys are 2048 bits minimum
- The web servers where the certificates are used have SSLCipherSuite ECDHE-RSA-AES128-SHA256:AES128-GCM-SHA256:RC4:HIGH:!MD5:!aNULL:!EDH in them
- A certificate requested for www.domain.com also has domain.com in it
- Pass!
- A script creates the configuration block dynamically to include the subjectAltName-list
- A certificate identifies the authority that issued it
- Pass!
- All certificates have the authorityInfoAccess in them
- A certificate has location of revocation information in it
- Pass!
- Intermediate and issued certificates have crlDistributionPoints in them
- CA certificates (both root and intermediate) identify themselves as CA certificates with proper usage in them
- Pass!
- CA certificates have basicConstraints=CA:TRUE in them
That's it! Feel free to ask for any details I forgot to mention here.
My experience is that only 1% of system admins have even a rough understanding of this entire issue. My favorite hobbies of cryptography, VPN tunnels and X.509 certificates don't seem to light anybody else's fire like they do mine.
HTTP Secure: Is Internet really broken?
Wednesday, December 25. 2013
What is HTTP Secure
HTTP Secure (or HTTPS) is HTTP + encryption. For the purpose of cryptographic communications between web servers and the people accessing them, we use X.509 certificates issued by a Certificate Authority.
In cryptography, a certificate authority or certification authority (CA), is an entity that issues digital certificates.
— Wikipedia
Using HTTPS and assuming it's safe to do so
One of the most famous HTTPS critics is Mr. Harri Hursti. He is a pretty famous computer security expert and has successfully hacked a number of things. He currently works in his own company, SafelyLocked.
#SSL is so insecure it’s only worth a post-it note from the NSA.
- Harri Hursti at Slush 2013, as relayed on Twitter
During an interview in the Finnish newspaper Talouselämä he said:
SSL, one of the cornerstones of network security, is broken and it cannot be fixed. It cannot be trusted.
That's a stiff claim to make.
What HTTPS certificates are for
I'm quoting a website http://www.sslshopper.com/article-when-are-self-signed-certificates-acceptable.html here:
SSL certificates provide one thing, and one thing only: Encryption between the two ends using the certificate.
They do not, and never been able to, provide any verification of who is on either end. This is because literally one second after they are issued, regardless of the level of effort that goes into validating who is doing the buying, someone else can be in control of the certificate, legitimately or otherwise.
Now, I understand perfectly well that Verisign and its brethren have made a huge industry out of scamming consumers into thinking that identification is indeed something that a certificate provides; but that is marketing illusion and nothing more. Hokum and hand-waving.
It all comes down to, can you determine that you are using the same crypto key that the server is? The reason for signing certificates and the like is to try to detect when you are being hit with a man-in-the-middle attack. In a nutshell, that attack is when you try to open a connection to your 'known' IP address, say, 123.45.6.7. Even though you are connecting to a 'known' IP address of a server you trust, doesn't mean you can necessarily trust traffic from that IP address. Why not? Because the Internet works by passing data from router to router until your data gets to it's destination.
Every router in between is an opportunity for malicious code on that router to re-write your packet, and you'd never know the difference, unless you have some way to verify that the packet is from the trusted server.
The point of a CA-signed certificate is to give slightly stronger verification that you are actually using the key that belongs to the server you are trying to connect to.
Long quote, huh. The short version of it is: do not assume that because somebody has been verified as somebody or something, they really are that. You can safely assume that the traffic between you and that party is encrypted, but you really don't know who has the key to decrypt it.
If we ignore the dirty details of encryption keys and their exchange during the encrypted connection setup and concentrate only on the encryption cipher suite: not all of the cipher suites in use are secure enough. There are pretty good ones, but the worst ones are pretty crappy and cannot be trusted at all.
Argh! Internet is broken!
To lend credence to Mr. Hursti's claims, I'll pick the cases of Comodo and DigiNotar. DigiNotar didn't survive the crack; their PR failed so miserably that they lost all of their business.
On March 15th 2011 a compromised affiliate company was used to create a new user account in Comodo's system. That newly created account was used to issue a number of certificates: mail.google.com, www.google.com, login.yahoo.com, login.skype.com, addons.mozilla.org and login.live.com. The gain of having those domains is immediately clear. Combined with some other type of attack (most likely DNS cache poisoning), a victim can be lured into a rogue site for Google, Yahoo, Microsoft or Mozilla without ever knowing that all of their data is compromised. Some of the possible attack vectors include injecting rogue software updates onto victims' computers.
On July 1st 2011 DigiNotar's CA was compromised, allegedly by Iranian crackers. The company didn't even notice the crack until the 19th of July. By that time there was a valid certificate issued for Google in the wild. Again, the false certificates were combined with some other type of security flaw (most likely DNS poisoning) to redirect victims to a fake "Google" site without them ever knowing that all their passwords and transmitted data were leaked.
Both of those cases scream: DON'T use certificates for verification. But in reality Microsoft Windows, Skype and Mozilla Firefox are built so that if the certificate on the other end is verified by the connecting client, all is good: the remote party is valid, trusted and verified. Remember from earlier: "They do not, and never been able to, provide any verification of who is on either end". Still, a lot of software is written to misuse certificates like the examples above do.
About trust in the Internet
From the above incidents we arrive at the following questions: Who do you trust on the Internet? Who would you consider a trustworthy party, so that you'd assume that what the other party says or does is true? Specifically in the Internet context: if a party identifies themselves, do you trust that they are who they claim to be? Or if a party says that they'll take your credit card details and deliver the goods you purchased, do you trust that they'll keep their promise?
Lots of questions, no answers. Trust is a complicated thing after all. Most people would say that they trust major service providers like Google or Microsoft or Amazon, but frown upon smaller ones with no reputable history to prove their actions. The big companies allegedly work with the NSA and gladly deliver your details and behavior for a US government agency to study whether you're about to commit a serious crime. So you choose to trust them, right? No.
Well then, are the smaller parties any safer? Nobody really knows. They may be, or then again they may not.
Regular users really don't realize that the vendor of their favorite web browser and operating system has chosen, on their behalf, to trust a number of other parties. Here is a glimpse of the list of trusted certification authorities in Mozilla Firefox:
I personally don't trust a company called AddTrust AB or America Online Inc. or CNNIC or ... the list goes on. I don't know who they are, what they do or why they should be worth my trust. All I know for a fact is that they got Mozilla to include their root certificate in Mozilla's software product, establishing implicit trust between them and me whenever I access a website they chose to issue a certificate to. That's a tall order!
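If you want to see which roots your own system trusts out of the box, here is a quick sketch for Linux (the path is distro-specific; /etc/ssl/certs is common on Debian and Ubuntu):

# Print the subject of every CA certificate the system trusts by default
for f in /etc/ssl/certs/*.pem; do
    openssl x509 -in "$f" -noout -subject
done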
There really is no reliable way of identifying the other party you're communicating with on the Internet. Many times I wish there was, but since the dawn of time the Internet has been designed to be an anonymous place, and that's the way it stays until drastic changes are introduced into it.
Conclusions
No, the Internet is not broken, despite what Mr. Hursti claims. The Internet is much more than HTTP or encryption or HTTP + encryption. For the purpose of securing HTTP we just need some secure method of identifying the other party. The critical applications would be on-line banking and on-line shopping. Optimally all traffic would be encrypted all the time (see the HTTP 2.0 details), but that is overkill after all. Encryption adds extra bytes to transfers, and for example most large files don't contain any information that would need to be hidden. When moving credit card data or passwords or such, the amount of bytes transferred is typically small in comparison to downloading system updates or install images.
I'd also like to point out that keeping a connection secure is mostly about the keys: keeping the keys safe and exchanging them safely. The actual ongoing encryption during a session can be considered secure enough to be trusted. However, a great deal of thought must be put into allowing or disallowing a cipher suite. And still: all of that is based on current knowledge. Every once in a while some really smart people find out that a cipher that was thought to be secure isn't, because of a flaw or a mathematical breakthrough which renders the cipher suite not secure enough to be used.
We need encrypted HTTP, but not the way it was implemented in 1994. It is not the task of Verisign (or any other CA) to say: "The other party has been verified by us, they are who they claim to be". There must also be room for longer and longer keys and new cipher suites to keep the secure Internet reliable.