Git: Grafting Commits (Simple And Sweet)

There are a lot of posts covering, with varying degrees of complexity, how to rebase/graft/replace a series of commits, but this is the process in the simplest case:

git rebase --onto <new parent> <old parent>

Given the parent commit of a series of commits, this moves them onto a different parent.

So, given these commits under BRANCH1:

AAA (HEAD)
AAB
AAC
AAD

..and these commits under BRANCH2:

BBA
BBB
BBC
BBD

…in order to place AAA->AAB->AAC on top of BBA (the tip of BRANCH2), run this with BRANCH1 checked out:

git rebase --onto BBA AAD

Standard rev-parse rules apply (you can use branches/tags instead of revisions).
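As a sanity check, the whole example above can be reproduced in a throwaway repository. All of the branch and commit names are from the example, but the script itself is just a sketch (it assumes git is on the PATH and that your git supports `checkout -`):

```python
import subprocess
import tempfile

def git(cwd, *args):
    # Run a git command in the given repository and return its stdout.
    return subprocess.run(
        ('git',) + args, cwd=cwd, check=True,
        capture_output=True, text=True).stdout

repo = tempfile.mkdtemp()
git(repo, 'init', '-q')
git(repo, 'config', 'user.email', 'demo@example.com')
git(repo, 'config', 'user.name', 'Demo')

def commit(message):
    # Each commit adds a unique file so nothing becomes empty when rebased.
    with open('{}/{}'.format(repo, message), 'w') as f:
        f.write(message)

    git(repo, 'add', message)
    git(repo, 'commit', '-q', '-m', message)

commit('AAD')                     # The old parent
git(repo, 'checkout', '-qb', 'branch2')
for m in ('BBD', 'BBC', 'BBB', 'BBA'):
    commit(m)

git(repo, 'checkout', '-q', '-')  # Back to the original branch
for m in ('AAC', 'AAB', 'AAA'):
    commit(m)

# Move AAA..AAC (everything after AAD) onto BBA, the tip of branch2:
git(repo, 'rebase', '-q', '--onto', 'branch2', 'HEAD~3')

print(git(repo, 'log', '--format=%s'))
```

The log afterward shows AAA, AAB, and AAC sitting on top of BBA's history, with AAD as the root.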

Enforcing Complex Passwords In Manjaro

Manjaro has a weird password-quality setup. The password-quality functionality is provided by the libpwquality package, which provides a PAM plugin named pwquality.so and configuration at /etc/security/pwquality.conf . However, even once installed, it is not referenced by the PAM configuration and so is not applied to password changes.

Before making changes, open a root prompt. You will make your changes here and have a place to restore your configuration from if you break anything. Make a backup of /etc/pam.d/system-auth .

Make sure you have the libpwquality package installed.

Open /etc/pam.d/system-auth . There are PAM configurations that deal with authenticating the user and PAM configurations that deal with updating passwords. Look for the password-setting configuration block; those lines will have “password” as the module-type (first column):

-password  [success=1 default=ignore]  pam_systemd_home.so
password   required                    pam_unix.so          try_first_pass nullok shadow
password   optional                    pam_permit.so

Insert the line for pam_pwquality.so above (always above) pam_unix.so, and update the configuration for pam_unix.so to:

password requisite pam_pwquality.so retry=3
password required pam_unix.so try_first_pass nullok shadow use_authtok

The change will apply immediately, though you probably will not see a difference with the default password-quality configuration settings.

Make your desired changes to /etc/security/pwquality.conf . We recommend the following settings:

minlen = 10
dcredit = -1
ucredit = -1
lcredit = -1
ocredit = -1
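The credit settings are easy to misread: a negative value means "require at least that many characters of this class" rather than granting length credit. As a rough Python approximation of what the settings above enforce (not libpwquality's actual algorithm, which also checks dictionary words, similarity to the old password, and more):

```python
import re

def meets_policy(password):
    # Approximates minlen=10 with dcredit/ucredit/lcredit/ocredit all at -1:
    # at least 10 characters, including at least one digit, one uppercase
    # letter, one lowercase letter, and one "other" (non-alphanumeric)
    # character.
    return (
        len(password) >= 10
        and re.search(r'[0-9]', password) is not None
        and re.search(r'[A-Z]', password) is not None
        and re.search(r'[a-z]', password) is not None
        and re.search(r'[^0-9A-Za-z]', password) is not None
    )

print(meets_policy('hunter2'))          # False: too short, no upper/other
print(meets_policy('Correct-Horse-9'))  # True
```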

Open a shell as a nonprivileged account that you would like to use to test password changes, and test the password-quality options you enabled to make sure they work as expected. If you made a mistake, fix it in the root prompt that you should still have open.

Important note: Unless you have enabled “enforce_for_root” in the password-quality or PAM configuration, you will only see advisory warnings for nonconformant passwords when running as root. You will still be able to set any password you’d like.

Retrieving Your Shopify Store’s GraphQL Schema

Using Python and SGQLC with Shopify’s latest API (2025-01):

$ python3 -m sgqlc.introspection https://<subdomain>.myshopify.com/admin/api/2025-01/graphql.json -H "X-Shopify-Access-Token: <secret>"

{
  "data": {
    "__schema": {
      "directives": [
        {
          "args": [
            {
              "defaultValue": "null",
...
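If you'd rather not install SGQLC, the same endpoint can be hit with the standard library alone. This sketch sends a minimal introspection query (rather than the full one SGQLC uses); the subdomain and token are placeholders for your own:

```python
import json
import urllib.request

def build_introspection_request(subdomain, token, api_version='2025-01'):
    # Shopify's Admin GraphQL endpoint; authentication is a single header.
    url = 'https://{}.myshopify.com/admin/api/{}/graphql.json'.format(
            subdomain, api_version)

    body = json.dumps({'query': '{ __schema { queryType { name } } }'})

    return urllib.request.Request(
            url,
            data=body.encode(),
            headers={
                'Content-Type': 'application/json',
                'X-Shopify-Access-Token': token,
            })

# req = build_introspection_request('<subdomain>', '<secret>')
# with urllib.request.urlopen(req) as response:
#     print(json.load(response))
```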

Incidentally, Shopify’s Python client establishes the schema’s URL here:

https://github.com/Shopify/shopify_python_api/blob/f58c9910e5699fbce950766772c6ac30634f85ea/shopify/resources/graphql.py#L9

Register New Executable Formats In The Linux Kernel

If the kernel identifies a known sequence of magic bytes in a file you’re trying to execute, it will happily execute it. However, you might want to run certain formats without needing to pass them to another tool explicitly. Inasmuch as these are usually binary formats, you can prepend a magic-bytes preamble (or use an existing sequence, if your binary format already declares one) and then associate it with a particular loader in the running kernel using binfmt_misc. Access is usually via procfs (currently /proc/sys/fs/binfmt_misc/register). Cross-executing binaries built for another architecture under an emulator is a common use-case for this.

Note that this doesn’t really apply to text files (to the kernel it’s all just data) because you’d need to inject gibberish at the top of your file to use this functionality, when a shebang (which is meant for exactly this case) should work just fine.

This is a walkthrough of how to do this for compiled Python or Lua source code: https://twdev.blog/2024/01/docker_multi_platform

Since both of the example compiled formats include magic bytes, you can write a tool that obtains them from the corresponding libraries (at least with Python) and calls binfmt_misc in one step.
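For instance, with Python you can pull the bytecode magic straight from the interpreter and build the binfmt_misc registration string from it. The entry name and interpreter path below are hypothetical, and actually writing the result to /proc/sys/fs/binfmt_misc/register requires root:

```python
import importlib.util

def binfmt_register_line(name='pyc', interpreter='/usr/bin/python3'):
    # The registration format is :name:type:offset:magic:mask:interpreter:flags
    # where type 'M' matches on magic bytes at the given offset (empty = 0),
    # and the magic bytes must be written as \xNN escapes.
    magic = ''.join(r'\x{:02x}'.format(b) for b in importlib.util.MAGIC_NUMBER)
    return ':{}:M::{}::{}:'.format(name, magic, interpreter)

print(binfmt_register_line())

# Then, as root (sketch):
#   echo '<the line above>' > /proc/sys/fs/binfmt_misc/register
```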

You can also write configs to be loaded automatically at boot. Formats registered through binfmt_misc appear as individual entries under /proc/sys/fs/binfmt_misc, but there appears to be no way to list the formats built into the kernel itself, with the only recourse being to sift the boot-time config paths (probably empty by default) or to check your kernel config to see which supported formats were included in that build.

Run Your Binary With Diagnostic Traces And Blinking Lights Like A Classic Mainframe

Blinkenlights

From the overview:

Computers once had operator panels that provided an intimate overview of the machine’s internal state at any given moment. The blinking lights would communicate the personality of each piece of software. Since our minds are great at spotting patterns, developers would intuitively understand based on which way the LEDs were flashing, if a program was sorting data, collating, caught in an infinite loop, etc. This is an aspect of the computing experience that modern machines haven’t done a good job at recreating, until now.

Check YAML Dictionary Key Uniqueness with PyYAML

If you are dealing with very large YAML files that are curated by hand over time, it is not inconceivable that someone will inadvertently introduce a duplicate key. The problem is that PyYAML just dutifully enumerates the nodes and loads a dictionary; it has no requirement, knowledge, or authority to do anything else, so by the time you get the dictionary back, the duplicates have already been lost.
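A quick demonstration of the silent loss (this assumes PyYAML is installed; the last value simply wins):

```python
import yaml

doc = """\
a:
  x: 1
a:
  y: 2
"""

# SafeLoader raises no error; the first 'a' mapping is silently discarded.
print(yaml.safe_load(doc))
```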

This implementation is only concerned with dictionaries under duplicate keys, not integers, strings, lists, etc. under duplicate keys. Note the corresponding comment in the code: this was unnecessary in my situation, and it is something you’ll have to account for if/when adapting this routine for your purposes.

The following code overrides the YAML loader and the map construction to do this. This source-code is also available as a gist.

import yaml

def load_and_assert_uniqueness(x):

    # We'd like to detect duplicates. Since PyYAML both loads things depth-first
    # *and* doesn't give us the parent when processing a child node, we'll index
    # all of the object IDs as we're constructing them, and then see which have
    # disappeared from the final hierarchy. Since all we can do is pass a
    # class, we need to inline the class in order to load into an index within
    # our scope.
    #
    # We're only concerned about dictionary keys with dictionary values because
    # a) this is our use-case, and b) we can stash additional information as
    # dictionary keys without having to override any types.

    nodes_by_id = {}


    class _UniqueCheckedLoader(yaml.SafeLoader):

        def construct_yaml_map(self, node):
            data = {}

            id_ = id(data)
            data['_id'] = id_
            nodes_by_id[id_] = data

            yield data

            value = self.construct_mapping(node)
            data.update(value)


    _UniqueCheckedLoader.add_constructor(
        'tag:yaml.org,2002:map',
        _UniqueCheckedLoader.construct_yaml_map
    )


    # Load

    blob = yaml.load(x, Loader=_UniqueCheckedLoader)


    # Remove all nodes in the final dictionary from the by-ID index

    q = [blob]
    while q:
        d, q = q[0], q[1:]

        id_ = d.pop('_id')
        del nodes_by_id[id_]

        for v in d.values():

            # We're only concerned with dictionary nodes
            if v.__class__ is not dict:
                continue

            q.append(v)


    # We've visited all referenced nodes. Everything still indexed must've been
    # pruned due to nonuniqueness. As mentioned above, we really don't have any
    # hierarchical context, but we can just search out occurrences of the
    # attributes from the node(s) in the data in order to find the duplicates.

    if nodes_by_id:

        # Cleanup representation before displaying

        nodes = []
        for node in nodes_by_id.values():
            del node['_id']
            nodes.append(node)

        # Error out

        raise \
            Exception(
                "({}) nodes were duplicates:\n{}".format(
                    len(nodes), nodes))


    return blob

Embedding Python in PostgreSQL Functions

Example (this requires the plpython3u extension, which must first be installed with CREATE EXTENSION plpython3u; as an "untrusted" language, it can only be created by a superuser):

CREATE OR REPLACE FUNCTION url_quote (url text)
RETURNS TEXT
AS $$
    from urllib.parse import quote
    return quote(url)

$$
LANGUAGE 'plpython3u';

SELECT url_quote('https://www.postgresql.org/docs/12/plpython-data.html#id-1.8.11.11.3');

Getting Started with Postgres Functions in PL/Python

How to Render Django Templates Without Loading Django (or Its Configuration System)

#!/usr/bin/env python3

import django.template
import django.template.engine

def _main():
    e = django.template.engine.Engine()

    body = """\
aa {{ test_token }} cc
"""

    t = django.template.Template(body, engine=e)

    context = {
        'test_token': 'bb',
    }

    c = django.template.Context(context)
    r = t.render(c)

    print(r)


_main()

Output:

$ ./test_render.py 
aa bb cc

AWS: Adding a new MFA device says “This entity already exists” or “MFA device already exists”

A team-member was trying to register a new MFA device in AWS, and was being told that they already had one registered:

However, their account claims that none are registered:

It looks like AWS might show an empty list when it shouldn’t if the user started the registration process but was interrupted before completing it. Use the AWS CLI “list-virtual-mfa-devices” subcommand to enumerate the current MFA devices:

$ aws iam list-virtual-mfa-devices
{
    "VirtualMFADevices": [
        {
            "SerialNumber": "arn:aws:iam::326764833890:mfa/karan"
        },
        {
            "SerialNumber": "arn:aws:iam::326764833890:mfa/rachel"
        },
        {
            "SerialNumber": "arn:aws:iam::326764833890:mfa/sarah.benhart"

Now, remove the problematic one using the corresponding SerialNumber value:

$ aws iam delete-virtual-mfa-device --serial-number <SerialNumber value>
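If you'd rather script the cleanup, the same two calls exist in boto3. The user name below is hypothetical, and the helper itself just matches the trailing component of the serial-number ARN:

```python
def find_mfa_serials(devices, username):
    # Virtual MFA serial numbers look like arn:aws:iam::<account>:mfa/<name>;
    # match on the component after the final slash.
    return [
        d['SerialNumber']
        for d in devices
        if d['SerialNumber'].rsplit('/', 1)[-1] == username
    ]

# With credentials configured (sketch):
#   import boto3
#   iam = boto3.client('iam')
#   devices = iam.list_virtual_mfa_devices()['VirtualMFADevices']
#   for serial in find_mfa_serials(devices, '<username>'):
#       iam.delete_virtual_mfa_device(SerialNumber=serial)
```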

You will now be able to restart the process with them. Make sure to have them remove any existing entries in their app so they don’t get confused.