Use GPG to Quickly Encrypt at the Command-Line


$ echo "cleartext" | gpg --passphrase "some-passphrase" -c --no-use-agent > text.encrypted
$ cat text.encrypted | gpg --passphrase "some-passphrase" --no-use-agent 2>/dev/null
cleartext

Without “--no-use-agent”, you might very well be prompted by a system keyring/agent for the passphrase every time.
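
Note that with newer GnuPG releases (2.1+), “--no-use-agent” is accepted but ignored, and “--passphrase” may also be ignored unless you run in batch/loopback mode. Something like the following should work there (a sketch; adjust for your GnuPG version):

$ echo "cleartext" | gpg --batch --pinentry-mode loopback --passphrase "some-passphrase" -c > text.encrypted
$ gpg --batch --pinentry-mode loopback --passphrase "some-passphrase" -d text.encrypted 2>/dev/null
cleartext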

Resolving Go Import URLs

The package path that you import may not correspond directly to a repository URL. If Go cannot find the package in your GOPATH, it will fetch the import path as a URL, following redirects as necessary, and will either look for a “meta” tag whose “name” attribute has the value “go-import” or use whatever URL it ends up at once there are no more HTTP redirects to follow.
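
For example, the import path used in the script's comments below, googlemaps.github.io/maps, serves a page with a meta tag like this:

<meta name="go-import" content="googlemaps.github.io/maps git https://github.com/googlemaps/google-maps-services-go">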

To speed things up, I wrote a short Python tool to do this and stashed it in a gist.

Make sure to install the dependencies:

  • requests
  • beautifulsoup4
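
For example, with pip:

$ pip install requests beautifulsoup4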

Source:

import sys
import argparse
import urllib.parse

import requests
import bs4

def _get_cycle(url):
    while True:
        print("Reading: {}".format(url))

        r = requests.get(url, allow_redirects=False)
        r.raise_for_status()

        bs = bs4.BeautifulSoup(r.text, 'html.parser')

        def meta_filter(tag): 
            # We're looking for meta-tags like this:
            #
            # <meta name="go-import" content="googlemaps.github.io/maps git https://github.com/googlemaps/google-maps-services-go">
            
            return \
                tag.name == 'meta' and \
                tag.attrs.get('name') == 'go-import'

        for m in bs.find_all(meta_filter):
            phrase = m.attrs['content']
            _, vcs, repo_url_root = phrase.split(' ')
            if vcs != 'git':
                continue

            return repo_url_root

        next_url = r.headers.get('Location')
        if next_url is None:
            break

        p = urllib.parse.urlparse(next_url)
        if p.netloc == '':
            # Take the schema, hostname, and port from the last URL.
            p2 = urllib.parse.urlparse(url)
            updated_url = '{}://{}{}'.format(p2.scheme, p2.netloc, next_url)
            print("  [{}] => [{}]".format(next_url, updated_url))

            url = updated_url
        else:
            url = next_url

    return url

def _main():
    description = "Determine the import URL for the given Go import path"
    parser = argparse.ArgumentParser(description=description)
    
    parser.add_argument(
        'import_path',
        help='Go import path')

    args = parser.parse_args()

    initial_url = "https://{}".format(args.import_path)
    final_url = _get_cycle(initial_url)

    print("Final URL: [{}]".format(final_url))

if __name__ == '__main__':
    _main()
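
Example usage (the import path is the one from the meta-tag comment above; the script name is whatever you saved the gist as):

$ python resolve_go_import.py googlemaps.github.io/maps
Reading: https://googlemaps.github.io/maps
Final URL: [https://github.com/googlemaps/google-maps-services-go]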

Best Argument-Processing for .NET/C#

After trying NDesk.Options and Fluent, I am nothing but impressed with CLAP (“Command-Line Auto-Parser”). It relies entirely on reflection and parameter attributes (usually just one or two) to automatically marshal your values, assign defaults, enforce required parameters, and generate command-line documentation. It’s beautiful and, so far, flawless. Well done.

using CLAP;

namespace MyNamespace
{
    class Program
    {
        [Verb(IsDefault = true, Description = "Print the current version of the given package and, optionally, increment it.")]
        public void Version(
            [Description("Project path")]
            [Required]
            string projectPath,

            [Description("Package name")]
            [Required]
            string packageName,

            [Description("Base version to increment from (if lower than current, else use current)")]
            string baseVersion = null,

            [Description("Increment the version before returning")]
            bool increment = false
        )
        {
            // ...
        }
    }
}

If you don’t decorate a parameter with the “Required” attribute and don’t provide a default value, the parameter will default to null. I explicitly set baseVersion to a default of null because I prefer being explicit.

Converting TFS 2015 Build Definition Names to Their Directory IDs

A TFS 2015 agent maintains a directory of build-space directories, where each corresponds to a single build definition. Each of these has a source directory (“s”; usually automatically checked-out), a binaries directory (“b”; for intermediate binaries prior to publish-oriented cherry-picking), and an artifact-staging directory (“a”; for artifacts to be published).

Not only is each build-space directory named with an integer that has no obvious mapping to its build definition, but that integer differs from one agent to the next. It turns out that there is a meta-information directory whose children are named for collection GUIDs. The children of those directories are, in turn, directories with integer names, and each of these contains a JSON file describing a build definition.
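
For reference, the layout looks roughly like this (an illustrative sketch; the integer names will differ on your agent):

<agent path>\_work\
    1\                          (build-space directory for one definition)
        s\                      (sources)
        b\                      (binaries)
        a\                      (artifact staging)
    SourceRootMapping\
        <collection GUID>\
            1\
                SourceFolder.json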

I quickly wrote a Python script that searches for a given build definition and prints its directory ID. The basic functionality is split into succinct, easily callable methods for whatever other tasks you might have.

import logging
import os.path
import json

_LOGGER = logging.getLogger(__name__)


class Tfs(object):
    def __init__(self, agent_path):
        self.__agent_path = agent_path
        self.__srm_path = \
            os.path.join(
                self.__agent_path, 
                '_work', 
                'SourceRootMapping')

        _LOGGER.debug("Agent SRM path: [%s]", self.__srm_path)

    def collection_gen(self):
        for filename in os.listdir(self.__srm_path):
            filepath = os.path.join(self.__srm_path, filename)
            if not os.path.isdir(filepath):
                continue

            _LOGGER.debug("Collection GUID: [%s]", filename)
            yield filename, filepath

    def definition_gen(self, collection_path):
        for srm_id_raw in os.listdir(collection_path):
            _LOGGER.debug("SRM ID: [%s]", srm_id_raw)

            definition_info_filepath = \
                os.path.join(
                    collection_path, 
                    srm_id_raw, 
                    'SourceFolder.json')

            with open(definition_info_filepath) as f:
                yield json.load(f)

    def lookup_definition(self, definition_name):
        for collection_guid, collection_path in self.collection_gen():
            for definition in self.definition_gen(collection_path):
                current_definition_name = definition['definitionName']
                _LOGGER.debug("Definition: [%s]", current_definition_name)

                if current_definition_name != definition_name:
                    continue

                return definition

        raise ValueError("Could not find definition: {}".format(
                         definition_name))


def _configure_logging():
    sh = logging.StreamHandler()
    formatter = logging.Formatter('%(asctime)s [%(name)s %(levelname)s] %(message)s')
    sh.setFormatter(formatter)

    _LOGGER.addHandler(sh)
    _LOGGER.setLevel(logging.DEBUG)

if __name__ == '__main__':
    import argparse

    parser = argparse.ArgumentParser()

    parser.add_argument(
        'agent_path', 
        help="Agent path")

    parser.add_argument(
        'definition_name', 
        help="Build-definition name")

    args = parser.parse_args()

    _configure_logging()

    t = Tfs(args.agent_path)
    definition = t.lookup_definition(args.definition_name)

    print(definition['agent_builddirectory'])

Output:

D:\development\python>python tfs.py c:\tfs_build_agent ConsoleProject.Dev
2016-07-07 23:32:02,756 [__main__ DEBUG] Agent SRM path: [c:\tfs_build_agent\_work\SourceRootMapping]
2016-07-07 23:32:02,756 [__main__ DEBUG] Collection GUID: [2cf8d3cb-b8d4-49e1-bbdb-2aacf02f48c4]
2016-07-07 23:32:02,757 [__main__ DEBUG] SRM ID: [1]
2016-07-07 23:32:02,757 [__main__ DEBUG] Definition: [ConsoleProject.Dev]
1
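
Each SourceFolder.json looks something like this (illustrative; only the two fields the script reads are shown, and the real file contains several more):

{
    "definitionName": "ConsoleProject.Dev",
    "agent_builddirectory": "1",
    ...
}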

Important Notes

  • The child directories of the SourceRootMapping directory will not exactly reflect the current build definitions. Some of them may represent build definitions that no longer exist.

  • One of the current work directories may be using an ID that was used by an old build definition at some point in the past. So, if you are building a dictionary of work-directory IDs to build definitions, you will have to use the build definition's ID or one of the timestamps described in its JSON file to reconcile which definition currently uses that directory (see the sketch below).
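
A minimal sketch of that reconciliation, reusing the Tfs class above (the 'lastRunOn' field is hypothetical; check your own SourceFolder.json files for the actual timestamp field name):

def map_build_directories(tfs):
    # Map each agent build-directory ID to the definition that used it most
    # recently, so that stale entries for deleted definitions are superseded.
    # 'lastRunOn' is a placeholder field name; substitute whichever timestamp
    # field your SourceFolder.json files actually carry.
    latest = {}
    for _, collection_path in tfs.collection_gen():
        for definition in tfs.definition_gen(collection_path):
            directory_id = definition['agent_builddirectory']
            stamp = definition.get('lastRunOn', '')

            if directory_id not in latest or stamp > latest[directory_id][0]:
                latest[directory_id] = (stamp, definition['definitionName'])

    return {
        directory_id: name
        for directory_id, (_, name)
        in latest.items()
    }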