There’s also potential investment funding for good ideas:
Run Windows applications that don’t natively support being Windows services as Windows services.
ZeroMQ (0MQ) is a beautiful library that basically replaces the socket layer with a very thin, pattern-based wrapper. Aside from removing this overhead from your code, 0MQ also usually gives you the guarantee that one read will return one message (or one part of a multipart message).
gevent is a coroutine-based networking library for Python. Coroutines allow you to use the time spent blocked on certain types of operations, like network requests, to perform other operations while waiting (this works best when you’re doing a number of similar operations in parallel). It’s a compromise that allows you to speed up synchronous operations to the point of being comparable to multithreading (at least in the case of network operations).
There was a point at which ZeroMQ didn’t support this (and a package named gevent_zmq had to be used), but it has since become compatible with it.
For example, a server:
import gevent
import zmq.green as zmq

_BINDING = 'ipc:///tmp/test_server'

context = zmq.Context()

def server():
    server_socket = context.socket(zmq.REP)
    server_socket.bind(_BINDING)

    while 1:
        received = server_socket.recv()
        print("Received:\n[%s]" % (received))
        print('')

        server_socket.send('TestResponse')

server = gevent.spawn(server)
server.join()
The corresponding client:
import gevent
import zmq.green as zmq

_BINDING = 'ipc:///tmp/test_server'

context = zmq.Context()

def client():
    client_socket = context.socket(zmq.REQ)
    client_socket.connect(_BINDING)

    client_socket.send("TestMessage")

    response = client_socket.recv()
    print("Response:\n[%s]" % (response))
    print('')

client = gevent.spawn(client)
client.join()
Displaying the output here would be redundant, given that the result is plainly obvious.
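The coroutine speed-up itself is easy to demonstrate in isolation. This is a minimal sketch using the standard library’s asyncio (chosen here only so the example has no third-party dependencies; gevent’s greenlets achieve the same effect): ten simulated network waits of 0.1s each complete in roughly 0.1s of wall time, not 1.0s.

```python
import asyncio
import time

async def fake_request(i):
    # Simulate a network call that spends 0.1s blocked on I/O.
    await asyncio.sleep(0.1)
    return i * 2

async def main():
    # Issue ten "requests" concurrently; while one waits, the others run.
    return await asyncio.gather(*(fake_request(i) for i in range(10)))

start = time.monotonic()
results = asyncio.run(main())
elapsed = time.monotonic() - start

print(results)
print("Elapsed: %.2fs" % (elapsed,))
```

Running the same ten calls synchronously would take roughly the sum of the waits; interleaving them brings the total down to roughly the longest single wait.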
Recently, Apple released an update that broke every C-based Python package, and probably more (the “-mno-fused-madd” compilation error). See here for more information.
To fix, add these to your environment:
export CFLAGS=-Qunused-arguments
export CPPFLAGS=-Qunused-arguments
Then, add this to your sudoers file (via visudo), which will allow sudo to have access to those variables:
Defaults env_keep += "CFLAGS CPPFLAGS"
Signed SSL certificates have a feature known as “extensions”. For extensions to appear in the signed certificate, they must be present in the CSR, so CSRs support them too. Although X.509 certificates are not meant to hold a lot of data and were never meant to act as databases (rather, an identity with associated information), they are a great solution when you need to store secured information alongside your application at a client site. Though the data is viewable, you have the following guarantees:
As long as you don’t care about keeping the data secret, this makes extensions an ideal solution to a problem like on-site client licenses, where your software needs to regularly check whether the client still has permission to operate. You can also use a CRL to disable the certificates if the client stops paying their bill.
These extensions accommodate data that goes beyond the distinguished-name (DN) fields (locality, country, organization, common-name, etc..), chain of trust, key fingerprints, the signatures that guarantee the trustworthiness of the certificate (using the signature of the CA), and the integrity of the certificate (the signature of the certificate contents). Extensions seem relatively easy to add to certificates, whether you’re creating CSRs from code or from command-line. They’re just manageably-sized strings (though it technically seems like there is no official length limit) of human-readable text.
If you own the CA, then you might also create your own extensions. In this case, you’ll refer to your extension with a unique dotted identifier called an “OID” (we’ll go into this in the ASN.1 explanation below). Libraries might have trouble if you just refer to your own extension without properly registering it with your library prior. For example, OpenSSL has the ability to register and use custom extensions, but the M2Crypto SSL library doesn’t expose the registration call, and, therefore, can’t use custom extensions.
Unsupported extensions might be skipped or omitted from the signed certificate by a CA that doesn’t recognize/support them, so be aware that you’ll need to stick to the popular extensions if you can’t use your own CA. Extensions that are mandatory for your requirements can be marked as “critical”, so that signing won’t proceed if any of your extensions aren’t recognized.
The extension that we’re interested in, here, is “subjectAltName”, and it is recognized/supported by all CAs. This extension can describe the “alternative subjects” (using DNS-type entries) that you might need to specify if your X.509 needs to be used with more than one common-name (more than one hostname). It can also describe email-addresses and other kinds of identity information. However, it can also store custom text.
This is an example of two “subjectAltName” extensions (multiple instances of the same extension can be present in a certificate):
DNS:server1.yourdomain.tld, DNS:server2.yourdomain.tld
otherName:126.96.36.199.4.1.99;UTF8:This is arbitrary data.
However, due to details soon to follow, it’s very difficult to pull the extension text back out again. In order to go further, we have to take a quick diversion into certificate structure. This isn’t strictly required, but the information is obscure enough that, without it, you’ll have no way to cope if you encounter issues.
All of the standard, human-readable SSL documents, such as the private-key, public-key, CSR, and X.509, are encoded in a format called PEM. This is base64-encoded data with anchors (e.g. “-----BEGIN DSA PRIVATE KEY-----”) on the top and bottom.
In order to have any use, a PEM-encoded document must be converted to a DER-encoded document. This just means that it’s stripped of the anchors and newlines, and then base64-decoded. DER is a tighter subset of “BER” encoding.
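As a concrete illustration, this is a minimal sketch of that conversion (and its inverse) using only the Python standard library; the function names here are my own, not part of any SSL library:

```python
import base64
import textwrap

def pem_to_der(pem_text):
    # Drop the BEGIN/END anchors and newlines, then base64-decode the rest.
    lines = [l.strip() for l in pem_text.strip().splitlines()]
    payload = ''.join(l for l in lines if l and not l.startswith('-----'))
    return base64.b64decode(payload)

def der_to_pem(der_bytes, label='CERTIFICATE'):
    # Re-wrap the base64 payload at 64 columns between the anchors.
    body = base64.b64encode(der_bytes).decode('ascii')
    wrapped = '\n'.join(textwrap.wrap(body, 64))
    return ("-----BEGIN %s-----\n%s\n-----END %s-----\n" %
            (label, wrapped, label))
```

A round-trip through these two functions returns the original bytes, which is all there is to the PEM/DER relationship.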
The DER-encoding describes an ASN.1 data-structure node. ASN.1 combines a tree of data with a tree of grammar specifications, and reduces down to hierarchical sets of DER-encoded data. All nodes (called “tags”) are identified by dot-separated identifiers called OIDs (mentioned above). Usually these are officially-assigned OIDs, but you might use some custom ones if you don’t have to pass your certificates to a higher authority that might have a problem with them.
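To make the encoding concrete, here is a hand-rolled sketch of how a single OID is DER-encoded: a tag byte (0x06), a length byte (this toy version assumes a body shorter than 128 bytes), and then the arcs, where the first two arcs share one byte and the rest are base-128 with the high bit set on all but the last byte. A real implementation would use an ASN.1 library, but the well-known RSA OID makes a handy check:

```python
def encode_oid(oid_str):
    # DER encodes an OID as tag 0x06, a length byte, then the arcs.
    arcs = [int(x) for x in oid_str.split('.')]

    # The first two arcs are packed into a single byte.
    body = bytearray([arcs[0] * 40 + arcs[1]])

    # Remaining arcs are base-128; the high bit flags continuation bytes.
    for arc in arcs[2:]:
        chunk = bytearray([arc & 0x7f])
        arc >>= 7
        while arc:
            chunk.insert(0, 0x80 | (arc & 0x7f))
            arc >>= 7

        body.extend(chunk)

    # Assumes len(body) < 128 (single-byte length form).
    return bytes([0x06, len(body)]) + bytes(body)

# The RSA OID (1.2.840.113549) should encode to 06 06 2a 86 48 86 f7 0d.
print(encode_oid('1.2.840.113549').hex())
```

This opacity is exactly why nothing in the raw bytes tells you what the data means: the same byte patterns are meaningful only against the matching grammar.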
In order to decode the structure, you must walk it, applying the correct specs as required. There is nothing self-descriptive within the data. This makes it fast, but useless until you have enough pre-existing knowledge to descend to the information you require.
The specification for the common grammars (like RFC 2459 for X.509) in ASN.1 is so massive that you should expect to avoid getting involved in the mechanics at all costs, and to learn how to survive with the limited number of libraries already available. In all likelihood, a need for anything outside the realm of popular usage will require a non-trivial degree of debugging.
ASN.1 has been around… for a while (about thirty years, as of this year). It’s obtuse, impenetrable, and understood in great detail by only a few individuals. However, it’s going to be here for a while.
The reason that extensions are tough to decode is that the encoding depends on the text that you put in the extension. Specifically, the “otherName” and “UTF8” parts. OpenSSL can’t present these values when it dumps the certificate, because it just doesn’t have enough information to decode them. M2Crypto, since it uses OpenSSL, has the same problem.
Now that we’ve introduced a little of the conceptual ASN.1 structure, let’s go back to the previous subjectAltName “otherName” example:
otherName:188.8.131.52.4.1.99;UTF8:This is arbitrary data.
The following is the breakdown:
I used the following calls to M2Crypto to add these extensions to the X.509:
ext = X509.new_extension(
        'subjectAltName',
        'otherName:184.108.40.206.4.1.99;UTF8:This is arbitrary data.')

ext.set_critical(1)
cert.add_ext(ext)
Aside from the extension information itself, I also indicate that it’s to be considered “critical”: signing will fail if the CA doesn’t recognize the extension, rather than the extension simply being omitted. When this gets encoded, it’ll be encoded as three separate “components”:
It turns out that it’s quicker to use a library that specializes in ASN.1 rather than trying to get the information from OpenSSL. After all, this data is out of OpenSSL’s scope: it’s colocated with cryptographic data while not being cryptographic itself.
I used pyasn1.
To decode the string from the previous extension:
This is a dump of the structure using pyasn1:
SubjectAltName().setComponentByPosition(
    0,
    GeneralName().setComponentByPosition(
        0,
        AnotherName().setComponentByPosition(
            0,
            ObjectIdentifier(220.127.116.11.18.104.22.168.99)
        ).setComponentByPosition(
            1,
            Any(hexValue='0309006465616462656566')
        )
    )
)
The process might seem easy, but this took some work (and collaboration) to get right, with the primary difficulty coming from obscurity meeting unfamiliarity. However, the process should be somewhat set in stone, every time.
This is the corresponding code. “cert” is an M2Crypto X.509 certificate:
from pyasn1.codec.der.decoder import decode
from pyasn1_modules import rfc2459

cert, rest = decode(cert.as_der(), asn1Spec=rfc2459.Certificate())

extensions = cert['tbsCertificate']['extensions']
for extension in extensions:
    extension_oid = extension.getComponentByPosition(0)
    print("0 [%s]" % (repr(extension_oid)))

    critical_flag = extension.getComponentByPosition(1)
    print("1 [%s]" % (repr(critical_flag)))

    sal_raw = extension.getComponentByPosition(2)
    print("2 [%s]" % (repr(sal_raw)))

    (sal, r) = decode(sal_raw, rfc2459.SubjectAltName())

    gn = sal.getComponentByPosition(0)
    an = gn.getComponentByPosition(0)

    oid = an.getComponentByPosition(0)
    string = an.getComponentByPosition(1)

    print("[%s]" % (oid))

    # Decode the text.
    s, r = decode(string, rfc2459.UTF8String())
    print("Decoded: [%s]" % (s))
    print('')
I wanted to provide an end-to-end tutorial on adding and retrieving “otherName”-type “subjectAltName” extensions because none currently exists. It’s a good solution for keeping data safe on someone else’s assets (as long as you don’t overburden the certificate with extensions, which will decrease verification efficiency).
Don’t forget to implement the CRL/CDP, or you’ll have no way to invalidate the certificate (and its extensions) short of waiting for it to expire.
I don’t often need to read or write archives from code. When I do, and I don’t want to call a tool via shell-commands, I’ll use zip-files. Obviously there are better formats out there, but when it comes to library compatibility, tar and zip are the easiest possible formats to manipulate. If you’re desperate, you can even write a quick tar archiver with relative simplicity (the headers are mostly ASCII).
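To back up the claim that a quick tar archiver is simple, this is a toy sketch of a single-file “ustar” writer (field offsets per the ustar header layout; numeric fields are octal ASCII). In practice you’d just use Python’s built-in tarfile module, which is also used here to verify the output:

```python
import io
import tarfile

def tar_header(name, size):
    # Build one 512-byte ustar header; numeric fields are octal ASCII.
    h = bytearray(512)
    h[0:len(name)] = name.encode('ascii')                # file name
    h[100:108] = b'0000644\x00'                          # mode
    h[108:116] = b'0000000\x00'                          # uid
    h[116:124] = b'0000000\x00'                          # gid
    h[124:136] = ('%011o' % (size,)).encode() + b'\x00'  # size
    h[136:148] = b'00000000000\x00'                      # mtime
    h[148:156] = b' ' * 8                                # checksum placeholder
    h[156] = ord('0')                                    # typeflag: regular file
    h[257:263] = b'ustar\x00'                            # magic
    h[263:265] = b'00'                                   # version
    # The checksum is computed with the checksum field set to spaces.
    h[148:156] = ('%06o' % (sum(h),)).encode() + b'\x00 '
    return bytes(h)

def make_tar(name, data):
    # Header, data padded to a 512-byte boundary, then two zero blocks.
    padding = b'\x00' * ((512 - len(data) % 512) % 512)
    return tar_header(name, len(data)) + data + padding + b'\x00' * 1024

blob = make_tar('hello.txt', b'hello, world\n')
with tarfile.open(fileobj=io.BytesIO(blob)) as tf:
    print(tf.getnames())
```

The only non-ASCII-friendly part is the checksum, and even that is just an octal rendering of a byte sum, which is what makes the format so approachable.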
Obviously, the emphasis here has been on availability. My preferred format is 7-Zip (which uses LZMA compression). Though you don’t often see 7-Zip archives for download, I’ve been using this format for eight years and haven’t looked back. The compression is good and the tool is every bit as easy to use as zip.
Unfortunately, there’s limited support for 7-Zip in Python. To the best of my knowledge, only the libarchive Python package can read and write 7-Zip archives. The libarchive Python package is developed and supported separately from the C library that it implements.
Though the library is structured to support any format that the libarchive library can (all major formats, and probably all of the minor ones), the Python project is outrightly labeled as a work-in-progress. 7-Zip is the only format explicitly supported for both reading and writing. Fortunately, it also supports libarchive's autodetection functionality. So, you can read/expand any archive, as long as you can afford the extra couple of milliseconds that the detection will cost you.
The focus of this project is to provide elegant archiving routines. Most of the API functions are implemented as generators.
To enumerate the entries in an archive:
import libarchive

with libarchive.reader('test.7z') as reader:
    for e in reader:
        # (The entry evaluates to a filename.)
        print("> %s" % (e))
To extract the entries from an archive to the current directory (like a normal, Unix-based extraction):
import libarchive

for state in libarchive.pour('test.7z'):
    if state.pathname == 'dont/write/me':
        state.set_selected(False)
        continue

    # (The state evaluates to a filename.)
    print("Writing: %s" % (state))
To build an archive from a collection of files (omit the target for stdout):
import libarchive

for entry in libarchive.create(
                '7z',
                ['/aa/bb', '/cc/dd'],
                'create.7z'):
    print("Adding: %s" % (entry))
Use the safe-rm utility to replace your system rm binary with a version that checks the path against a blacklist before continuing.
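The idea is simple enough to sketch as a shell function. This is a toy illustration of the blacklist check, not the actual safe-rm implementation (the real tool reads configurable blacklists), and the protected-path list below is my own example:

```shell
# Toy sketch of the safe-rm idea: refuse to act on blacklisted
# paths, then delegate to the real rm.
safe_rm() {
    blacklist="/ /bin /boot /etc /home /lib /root /usr /var"

    for arg in "$@"; do
        # Normalize a trailing slash so "/etc/" also matches "/etc".
        trimmed=${arg%/}
        [ -n "$trimmed" ] || trimmed="/"

        for protected in $blacklist; do
            if [ "$trimmed" = "$protected" ]; then
                echo "safe-rm: refusing to operate on '$arg'" >&2
                return 1
            fi
        done
    done

    command rm -- "$@"
}
```

The real utility goes further (handling symlinks, relative paths, and recursive flags), which is why replacing rm with the packaged tool beats rolling your own.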