I use the following script to run a daily upgrade on the #BookStack instance:
git pull origin release && composer install --no-dev && php artisan migrate --force
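If it helps, the one-liner can be wrapped in a small function with fail-fast behaviour. This is a sketch, not the script I actually run; the install path /var/www/bookstack is an assumption, so adjust it to your setup:

```shell
# Sketch: wrap the upgrade so each step only runs if the previous succeeded.
# The path /var/www/bookstack is an assumed install location; adjust it.
upgrade_bookstack() {
  cd /var/www/bookstack || return 1
  git pull origin release &&
    composer install --no-dev &&
    php artisan migrate --force
}
```

Calling `upgrade_bookstack` from a daily cron entry then mirrors the manual routine.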
I'm looking to create a cross-platform executable of my command-line application. The application uses Poetry as its package manager, and I'm running this on macOS.
First we need to install PyInstaller:
$ poetry add pyinstaller --dev
Then it should be as simple as running the following:
$ poetry shell
$ pyinstaller --onefile app/app.py
However, this resulted in the following error:
ModuleNotFoundError: No module named 'macholib'
macholib can be used to analyze and edit Mach-O headers, the executable format used by Mac OS X.
It’s typically used as a dependency analysis tool, and also to rewrite dylib references in Mach-O headers to be @executable_path relative.
Manually adding this dependency to the project addresses this:
$ poetry add macholib --dev
However, calling pyinstaller again throws a new error:
Unable to find "nltk_data" when adding binary and data files.
NLTK is a leading platform for building Python programs to work with human language data. It provides easy-to-use interfaces to over 50 corpora and lexical resources such as WordNet, along with a suite of text processing libraries for classification, tokenization, stemming, tagging, parsing, and semantic reasoning, wrappers for industrial-strength NLP libraries [...].
Fortunately we can download the missing data as follows (from within the poetry shell):
$ python
>>> import nltk
>>> nltk.download()
Press the Download button in the window that pops up, and make sure the installation path matches the path in the error message.
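The GUI step can also be skipped entirely. A sketch, where "punkt" is only an example dataset name; substitute whichever dataset the error message asks for:

```shell
# Sketch: fetch NLTK data non-interactively from the project environment.
# "punkt" is an example dataset name, not necessarily the one you need.
download_nltk_data() {
  poetry run python -c 'import nltk; nltk.download("punkt")'
}
```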
Then fix the path to NLTK in PyInstaller according to this StackOverflow answer. To get to the right location, ask poetry:
$ poetry env list --full-path
$ cd <path>/lib/python3.7/site-packages/PyInstaller/hooks
Rerunning the pyinstaller command after this produced an executable! However it's 1.2GB, probably due to including the macholib library.
I will update this post when I've figured this out.
I just released Fafi v0.1.5-alpha. Fafi is a console application that indexes and searches the page content of Firefox bookmarks.
pipx install fafi
v0.1.5 stores the database in its own application settings (using https://
I run my own Nextcloud server, and every other week there's an update to the server, or one of the plugins I've enabled. The following steps enable automatic updates to the Nextcloud server.
This assumes Nextcloud is installed under /var/www/nextcloud.
Add a scheduled task as the webserver user:
$ sudo -u www-data crontab -e
Add the following line to run the upgrade script once a day at 4:05am. Instead of running the upgrade process directly we run a script so that we can also run it from the shell if needed:
5 4 * * * cd /var/www/nextcloud && ./upgrade.sh
Now create the script and make it executable:
$ cd /var/www/nextcloud
$ nano upgrade.sh && sudo chmod +x upgrade.sh
The script itself runs the server updater (the --no-interaction argument prevents it from prompting with questions), followed by the occ utility to update all installed apps.
#!/bin/bash
pushd "$(dirname "$0")/updater"
/usr/bin/php updater.phar --no-interaction
cd ..
./occ app:update --all
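Since the upgrade lives in a script rather than the crontab, it can also be exercised by hand. A sketch, assuming the paths above and running it as the same user cron uses:

```shell
# Sketch: invoke the cron job's command manually, as the webserver user.
# The path /var/www/nextcloud matches the assumed install location above.
run_nextcloud_upgrade() {
  sudo -u www-data /var/www/nextcloud/upgrade.sh
}
```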
Over the holidays I've built a little tool, Faff, to index and search the page contents of Firefox bookmarks. This allows searching using words that appear on the pages rather than in the bookmark titles. It uses full-text search with ranking/relevance and snippets; it's still quite a work in progress. More info at https://
It's a Python command-line tool that uses SQLite's full-text search and Newspaper's text extraction, so a search over all my bookmarks takes only about 0.3 seconds, although the indexing is certainly slow.
Cuttlefish is a PHP-based hackable blog framework -- with the goals of being fast, easily hackable, and easy to adopt. I've been working on it since 2012, when it was known as Carbon. It can generate a static HTML site for uploading anywhere, or run dynamically.
Version 0.4 licenses the code as MIT, so anyone can build on top of the project. Cuttlefish now has API documentation courtesy of PHPDox, which is updated whenever code is changed. I've changed the code style from 'WordPress-like' to the PHP community default of PSR-12. The project now comes with a Docker container, which means getting up and running is even easier.
Installing Cuttlefish is easy using the instructions. For a fuller list of changes see https://
Known issue: I still have trouble getting Xdebug to work; if you're familiar with Docker Compose and Xdebug, I could use your help.
For v0.5, now that the codebase is in a better state, I'm looking at adding more features again.
Add the following lines to ~/.gitconfig to load configuration only for repositories within a certain location:
[includeIf "gitdir:~/dev/work/"]
    path = ~/dev/work/.gitconfig
For example, this can be used to set work email and signing keys.
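The included file then holds only the location-specific overrides. A sketch of what ~/dev/work/.gitconfig might contain; the email address and key ID are placeholders:

```ini
# ~/dev/work/.gitconfig -- the values below are placeholders
[user]
    email = you@work.example.com
    signingkey = ABCD1234
[commit]
    gpgsign = true
```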
If you follow this blog you will have noticed that I've commented multiple times on needless distractions that seem to have pervaded modern computing. Today, let's present a few solutions.
Firstly, when in the flow of doing deep work, watching animations delay your actions can be a source of frustration. While we can't speed up GitHub, we can speed up macOS. In the accessibility settings, check Display > Reduce Motion. This will speed up the interface and Mission Control animations. If you miss the garishness, you can turn it back off but chances are you will notice the system not getting in the way as much.
Secondly, a tip for fellow Homebrew users. When you're ready to work through that difficult project tooling setup, it can be the worst time for Homebrew to decide to update itself, especially as this can take up to half a minute depending on how far behind your version is. Not now, Homebrew! If you follow the instructions at Homebrew Autoupdate, you can set up Homebrew to update itself in the background.
Finally, if you're using oh-my-zsh or a bash equivalent, there will be a setting for it to update itself without prompting. Edit ~/.zshrc and add DISABLE_UPDATE_PROMPT="true". I've not had any issues.
Cheers to a focused work experience!
Unless you're using DNS over HTTPS (DoH), you can speed up general DNS requests by running a local DNS proxy and increasing the expiry time of cached DNS responses. I'll go into this further once I've updated this post to have dnsmasq do DoH.
The following configuration will speed up browsing in Safari for example.
brew install dnsmasq
Load all configs from /usr/local/etc/dnsmasq.d/:
echo "conf-dir=/usr/local/etc/dnsmasq.d,*.conf" | sudo tee -a /usr/local/etc/dnsmasq.conf
mkdir -p /usr/local/etc/dnsmasq.d
Then create a file in that directory (any name ending in .conf) with the following:
# Tell dnsmasq to get its DNS servers from this config file only.
no-resolv
# Add router DNS (replace with your own router's address)
server=192.168.1.1
# cache for 1h
min-cache-ttl=3600
Start dnsmasq on boot and launch it:
sudo brew services start dnsmasq
dig cnn.com @127.0.0.1
The query time should be 0 the second time, and an ANSWER SECTION should be returned. If that is the case, open System Preferences > Network > Advanced > DNS > +.
Enter 127.0.0.1 and hit OK > Apply.
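To watch the cache doing its job, the dig check above can be run twice in a row. A small helper sketch, where example.com is just an example domain:

```shell
# Sketch: query the local resolver twice; the second "Query time" line
# should report (near) 0 msec because the answer comes from the cache.
check_dns_cache() {
  dig example.com @127.0.0.1 | grep -i 'query time'
  dig example.com @127.0.0.1 | grep -i 'query time'
}
```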