Notes

systemd-run | systemd project

systemd touts itself as a set of building blocks for making a sane Linux system. This means it has lots of really useful features like systemd resource 'slices', automatically logging and restarting 'services', and much more.

One of the minor inconveniences is that you must set all of this up beforehand, as static configuration, through things like '.service' files.

Actually, systemd provides a utility called systemd-run that allows units, whether socket, service, or anything else, to be created on the fly for the session. This lets you leverage all of your pre-configured systemd functionality for anything you decide to run today, and that can be quite useful.

One very good use is creating background jobs that are logged and re-runnable centrally. Say a program like Steam is going to be started up, and you want to keep track of its output. systemd-run can be used to create a service whose log can be pulled from the journal at any time, and the logs are persistent by default. With this, running something like systemctl status --user steam will show what has gone on with the program, whether it's still running, and some of the captured standard output.

A basic usage of the command would be systemd-run --user --unit <new_unit_name> <command_to_run> <any_command_arguments> In practice: systemd-run --user --unit steam steam -silent -console Where --user instructs systemd to manage user-level services, rather than system ones, which allows for rootless units.

pee | moreutils

tree

Making a tree from piped files

You can make tree read from a pipe using the --fromfile . argument, where . is a special value that means to read from standard input. The invocation is admittedly a bit esoteric, but it works.

You can chain this with something like GNU find or fd to create a tree output from matched files.

xargs

Substitute

xargs can substitute the piped data into one place in one command using the -I <replacement_identifier> option. For example: echo "hello" | xargs -I{} echo {} world would produce hello world
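A runnable sketch of the invocation ({} is just the conventional choice of replacement token, any string works):

```shell
# -I names a token; each input line is substituted wherever the token
# appears in the command that follows
echo "hello" | xargs -I{} echo {} world   # prints: hello world
```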

Lua Language

-- Produce a proxy table that is a read-only view of a table
local function const(original_table)
   return setmetatable({}, {
         -- Reads fall through to the original table
         __index = original_table,
         -- Writes are rejected
         __newindex = function(table, key, value)
            error("Attempt to modify read-only table")
         end,
         -- Hide the metatable so it cannot be swapped out
         __metatable = false
   })
end

git-bug

git-bug is a new decentralized, terminal focused issue tracking workflow, written in Go.

It has the advantage of being embedded directly in the git repository, unobtrusively.

C++ Language | cpp

[[nodiscard]] | Warning about unused return values | Since C++17

Sometimes a function really shouldn't be used without taking the return value; things like memory allocations are a good example. C++ provides the [[nodiscard]] hint for encouraging the compiler to emit warnings when the return value is unused.

Compiler Sanitization Checks | Runtime Error Checking

A good few compilers offer sanitization checks, like clang's -fsanitize=memory. These inject runtime checks into the program for things like out-of-bounds access and stack corruption. They can be a better way to catch bugs than manually debugging or writing error-checking code, at the cost of some runtime performance and compile time.

Generally speaking, this should be worth it for debug builds.

References: https://clang.llvm.org/docs/AddressSanitizer.html

Banning Identifiers | GCC Poison | MSVC Deprecated

Sometimes you want to fully ban symbols and identifiers. For this, there is the GCC poison pre-processor directive; even though it says GCC, a good few other compilers support it as well.

These directives will throw an error and stop compilation if any of the listed identifiers are used in the current translation unit.

for example

#pragma deprecated (strcpy, strcpyA, strcpyW, wcscpy, _tcscpy, _mbscpy, StrCpy, StrCpyA, StrCpyW, lstrcpy, lstrcpyA, lstrcpyW, _tccpy, _mbccpy, _ftcscpy)

#pragma GCC poison strncpy wcsncpy _tcsncpy _mbsncpy _mbsnbcpy StrCpyN StrCpyNA StrCpyNW StrNCpy strcpynA StrNCpyA StrNCpyW lstrcpyn lstrcpynA lstrcpynW

Preventing Nested Class Hell

When writing modern C++ code a common thing that comes up over and over again is the misuse of classes.

One such misuse is very long inheritance chains, where one class derives from another class, which derives from another, and so on and so forth.

Inheritance isn't bad in itself, but when dealing with more than roughly 2 levels of inheritance, the logic of the program starts to break down very rapidly: it becomes confusing, can have surprising behavior, and picks up language-specific gotchas. For this reason, it might be advisable to forcibly limit the inheritance chain length.

If you are writing a derived class, and you have already inherited from 2 classes, you can declare the class final to prevent any further classes deriving from it.

class test final
{};

As a suggested alternative to inheritance, prefer composition: define classes as member variables in the class body, or use entity-component-systems (ECS) to structure complex objects. It is much more flexible, simpler, more powerful, and easier to understand.

Check if binary has debug symbols | readelf

Sometimes you may want to verify that a binary has debugging symbols. GNU binutils (a dependency of gcc) provides a readelf command that can find this out.

readelf --debug-dump=decodedline <binary_file>

If the binary has debugging symbols, you will get a large output on screen, with things like the source file location, the line number, and the starting address.

Ghetto Chrono Profiling

// Early in file setup
#include <chrono>
using namespace std::chrono;
using namespace std::chrono_literals;
// ...<Before the start of program loop>
system_clock::time_point programEpoch = system_clock::now();
system_clock::time_point lastFrameEnd = programEpoch;
system_clock::time_point frameStart   = programEpoch;
system_clock::time_point lastSecondTime = programEpoch;
int frameCounter = 0;
float framerate = 0;
milliseconds frametime = milliseconds(0);
milliseconds timeSinceLastSecond = milliseconds(0);
// <Start of profiled loop>...
// lastFrameEnd - frameStart is the previous frame's duration
frametime = duration_cast<milliseconds>(lastFrameEnd - frameStart);
frameStart = system_clock::now();
timeSinceLastSecond = duration_cast<milliseconds>(frameStart - lastSecondTime);
if (timeSinceLastSecond > 1s)
{
    // 1 second average
    framerate = frameCounter;
    frameCounter = 0;
    lastSecondTime = frameStart;   // reset the 1-second window
}
// ...<End of profiled loop>
// Profiling end
++frameCounter;
lastFrameEnd = system_clock::now();

std::cout performance hacks

std::ios_base::sync_with_stdio(false);  // drop C stdio synchronisation
std::cin.tie(nullptr);                  // stop std::cin flushing std::cout before each read

consteval | C++ 20

consteval is a new keyword in the C++20 standard that aims to improve compile-time programming by making up for the shortcomings of constexpr.

A normal constant expression requires that everything inside it directly produces a constant, compile-time value. In practice a constexpr function is merely allowed to run at compile time, and silently becomes a normal runtime call when its inputs are not constant.

consteval improves on this by marking a function as an 'immediate function': every call must be evaluated at compile time and produce a constant, otherwise the program fails to compile.

std::cout will use scientific notation for extremely large floating-point numbers

Catching exceptions by value instead of by reference (e.g. std::exception&) in a catch statement will cause object slicing

std::ifstream reads files in chunks

This likely isn't that performant if the contents aren't immediately copied; even then, it is liable to make many filesystem requests and perform random IO instead of sequential. It might be a good idea to look into tracing syscalls to find out more.

cut | GNU Coreutils
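cut extracts delimited fields or character ranges from each input line; a couple of minimal examples:

```shell
# Second colon-delimited field of a passwd-style line
echo "user:x:1000:1000" | cut -d: -f2   # prints: x
# Characters 2 through 4
echo "abcdef" | cut -c2-4               # prints: bcd
```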

Bash

Logical Statements as control flow && || ;

It is common to see the && sign used to run 2 commands one after another, but the actual meaning is a little more nuanced, and it's both important and useful to know about.

A 'list' is a group of commands chained using control-flow operators. && is the 'AND' list operator when used between commands: it will only run the second command if the first one succeeds (exits with status 0). || is the 'OR' counterpart, running the second command only if the first fails, while ; simply runs the commands in sequence regardless of exit status.
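A short, runnable demonstration of the three list operators:

```shell
true  && echo "AND: runs because the first command succeeded"
false || echo "OR: runs because the first command failed"
false ;  echo "';': always runs, exit status is ignored"
```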

Treat unset variables as errors with set -o nounset

Problems caused by unset variables can be hard to track down. Putting this option near the top of a bash script forces the script to throw an error whenever a variable being read is not set.

LDoc

A lua documentation generation tool

cat | GNU Coreutils

cat is a utility which concatenates multiple files to stdout. It is also often used for reading short files in the terminal, although less might be better suited to that.
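The concatenation behaviour in one throwaway example (the file paths are arbitrary):

```shell
printf 'first\n'  > /tmp/demo_a.txt
printf 'second\n' > /tmp/demo_b.txt
cat /tmp/demo_a.txt /tmp/demo_b.txt   # prints: first, then second
```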

moreutils

More utilities is designed as an extension to the GNU coreutils, implementing various common command line utilities that "nobody thought to write 30 years ago".

Markdown

Colouring Text

Markdown itself doesn't support coloured text, but it is designed to be used on the web. This means you can inline HTML to change the colour, like so: <span style="color:blue">some *blue* text</span>.

git

List tracked files

git ls-tree -r master --name-only <path_name>
-r - recurse into sub-trees
--name-only - only show filenames, not object ids and types
<path_name> - show tracked files in this directory

Or, this one liner to list any file ever in the object tree. git log --pretty=format: --name-only --diff-filter=A | sort - | sed '/^$/d'
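A throwaway-repository sketch of the ls-tree command (the file names and identity are made up; HEAD is used here instead of master because the default branch name varies):

```shell
cd "$(mktemp -d)"                 # scratch repository
git init -q .
mkdir -p src
touch src/main.c README.md
git add .
git -c user.email=demo@example.com -c user.name=demo commit -q -m "add files"
git ls-tree -r HEAD --name-only   # prints: README.md and src/main.c
```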

Pushing Habits

Reflex Push - When working with remote repositories, it is ill-advised to habitually push every single commit. Trying to fix commits after they have been pushed upstream is one of the most painful things you can do with Git, and is best avoided.

For this reason, it's best to keep small numbers of non-time-critical commits locally, so that you are sure you haven't made a silly mistake before pushing.

If for some inexplicable reason you need to push, say to use upstream as a cloud repo, just make a new branch, nobody is going to mind a trash branch that gets deleted, and it'll stick around in the reflog.

But people will mind if you push a trash commit to a good branch, or worse, master. Don't be haphazard or clumsy with anything that other people will pull.

Managing repos with ssh

For GitHub you can set it up to use your pre-existing ssh-authenticated key by changing or adding a remote that uses an ssh-style URL, similar to the following: git remote set-url origin git@github.com:mallchad/cemacs.git git remote add ssh git@github.com:mallchad/cemacs.git

Checkout a remote branch

Checking out a remote branch directly is actually impractical, since the point of git is to always work locally. A better approach is to make a local copy of the branch and check that out.

First, ensure you have fetched from the remote (often 'origin') and the branch is being tracked: git fetch origin Then create the branch and check it out using the following command: git checkout -b <branch> origin/<branch>

In more modern versions of git, you might get away with a 'dwim' style command that just guesses you want to try to fetch something from upstream. git checkout <branch>

Showing changes

git diff is the command to show changes between two file trees. By default, it will compare the work tree to HEAD, the most recent referenced commit.

To see the changes in the stage, use the command git diff --staged (also spelled --cached)

Quickly adding files

If you know the exact name of a file you can add it with a glob expression, say you want to commit a file magic_cookie.patch in a really deep folder. You can simply run something similar to git add "*magic_cookie.patch*" Assuming no other file exists with a similar name, it will add that exact file, no fuss.

Alternatively, if you know the exact folder your change is, and you verified there are no other changes in that folder, you can simply reference the folder name, for example git add parent/folder/of/magic_cookie

Restoring Work

Simply run git restore <path_to_restore> to reset it to HEAD. Although, it should be noted git considers this an experimental command.

Missing Work Tree

Sometimes you might encounter a situation where git complains about there being no work tree. In this case you may have to reset the GIT_WORK_TREE environment variable to get things working again. This may also help in other situations where git mis-detects the work tree.

Credentials Tips

Typing in your credentials every time can be a hassle. Fortunately git provides a method of caching credentials in memory; run the following command to cache them for 3600 seconds (1 hour): git config --global credential.helper 'cache --timeout=3600'

You can also use the store option to cache credentials on disk. This may be tempting, but do NOT do it, it will store your password in plain text.

ssh is probably a better alternative. Also, for GitHub you can generate access tokens which can be revoked at any time, but again, don't share these or store them in plain text.

Signing

Git supports signing tags with GPG signatures. This can help with security by better guaranteeing that something is from the original author, as well as verifying the author is who they say they are.

More recent versions of git support signing individual commits. To sign a commit, simply add the -S flag when committing.

GNU Emacs

xref

xref is a powerful and flexible mechanism for rapid navigation of source code. xref-find-definitions, one of the most useful functions, will try to determine the definition of the symbol under the point and jump to it; if the definition appears to be ambiguous, you will be prompted with a list of definitions to jump to.

Once you have visited a place with xref, you can quickly jump back with xref-pop-marker-stack

Emacs Lisp

Fundamentals | Reading Material

Emacs Manual elisp Buffer Contents

Macros

A macro is a function-like object that takes and returns elisp expressions instead of values and symbols. They are defined with the defmacro macro. Arguments are passed unevaluated into the macro body.

This can be useful for writing shorthand helpers that expand into otherwise verbose or complex elisp structures. The downside is precisely that arguments are unevaluated before being substituted.

Emacs lisp macros suffer from the same problem as macros in most other languages: their use case is niche and their nature leads to confusing, even downright dangerous behavior. One common issue with macros is variable capture (shadowing), where a symbol's value is overridden by another. This is confusing and problematic to deal with; since the return value is a lisp expression, not a value, the contents of the expression could easily contain variable definitions of its own.

Because of this it is advisable to treat the macro as what it is, an expansion and replacement function, and try to do minimal, or no logic within the macro itself, and just immediately expand it into the desired form, and let the lisp expression do the heavy lifting.

When building forms, you can use backquote (`) list building to evaluate selected parts of the form in place.

Basic, but useful functions

line-beginning-position - return buffer position for the beginning of the line
line-end-position - return buffer position for the end of the line
char-before - return char before point or POS
char-after - return char after point or POS
skip-chars-forward - move the point forwards based on regex matches
skip-chars-backward - move the point backwards based on regex matches
fboundp - return t if a symbol's function definition is not void
buffer-string - return the entire contents of the accessible buffer
buffer-substring - return a string of the buffer contents between 2 points
goto-char - move the point to a new position
search-string - move the point to a position matching a supplied string
number-sequence - return a sequence between two numbers
current-word - return the symbol or word near the point
insert - insert strings or characters into the buffer after the point
insert-char - insert 1 character, optionally repeated, into the buffer after the point
buffer-list - return the buffer list, a list of open buffers for this session
switch-to-buffer - display a buffer in the selected window
window-buffer - return the buffer a window is displaying
with-current-buffer - evaluate treating another buffer as current temporarily
generate-new-buffer - create a new named buffer
generate-new-buffer-name - create a new buffer name that isn't already taken

car - return the first element (the car) of a list
cdr - return the list of elements following the first element (the cdr)
cadr - return the car of the cdr of a list (the second element)
cddr - return the cdr of the cdr of a list (the list following element 1, 0-indexed)
setcar - modify the value of the car of a list
setcdr - modify the value of the cdr of a list
assoc - return the cell of an associative list whose car equals a key
rassoc - return the cell of an associative list whose cdr equals a key
assq - return the cell of an associative list whose car is eq to a key
rassq - return the cell of an associative list whose cdr is eq to a key
ignore-errors - evaluate a form, returning nil on error without throwing
make-symbol - create a named, non-interned, non-global symbol
intern - return a named symbol, creating one in the 'obarray' if not found
current-column - return the column under the point
move-to-column - move the point to the specified column

ielm - An interactive lisp interpreter

This is a useful way of executing emacs lisp expressions and seeing the result. It is an alternative to a shell and to evaluating regions.

if statement

In emacs lisp the 3rd argument of an if statement is the "else" branch. If you want to run multiple statements in a branch you need to wrap them in some kind of form. The most common is progn, which evaluates all arguments sequentially and returns the last value.

GPG - GNU Privacy Guard

Searching for keys

GPG supports easily searching and importing keys by name, email, or partial key. Simply run the following: gpg --search-keys <key_or_name_or_email> Afterwards you will be prompted for which key to select or inspect. Optionally, specify a different keyserver for the operation: gpg --keyserver https://keyserver.ubuntu.com --search-keys <key_name_or_email>

Keyserver Issues

For some reason there seem to be regular issues relating to the keyservers in use.

As it turns out, as of [2021-09-12 Sun 13:26], the most usable keyserver is https://keyserver.ubuntu.com, and in fact this keyserver is regularly synchronized with the old SKS pool network. https://pgp.mit.edu is another fairly reliable old service. https://keys.openpgp.com is a more recent keyserver, but does not synchronize with any other keyserver; this particular one will not publish personal information like email addresses without publisher consent (and verification).

Refreshing Expiry

Periodically you may want to extend the expiry on a GPG key. The typical way is pretty slow and cumbersome; however, if you are within expiry, you can use the "quick" switch like so: gpg --quick-set-expire <key_id> <period_or_date> Or you can apply the extension to a glob of, or all, subkeys, for example: gpg --quick-set-expire 3CAD89B9404A33AFDCA704E68358A164 '*'

Alpine Linux

Dev Tool Metapackage alpine-sdk

This contains the most commonly used build tools like gcc and make.

CMake

Just Run a Clean

Using the cmake build tool you can specify that you want to run the clean target. Simply run a command like the following: cmake --build build --target clean Where 'build' is the build directory containing the CMake cache files

Clean Before Build

Using the cmake build tool you can specify that you want to run clean targets first. For this, simply run a command like the following cmake --build build --clean-first Where 'build' is the build directory containing the CMake cache files

Default Value Variables

It is quite common to want to write to the CMake cache externally, while providing a default value if nothing is specified. There is a simple, albeit verbose, mechanism to achieve this.

set(FERING_BUILD_DIR ${CMAKE_BINARY_DIR} CACHE STRING "A custom path for the build directory")

Build Directory Suffixes (Debug/Release)

Some build generators like msbuild and xcode like to manipulate the output directories and append a suffix like "/build/Debug". This can be quite annoying, especially when dealing with many toolchains and not being able to easily determine where the build output will actually land. Strangely enough, CMake does have a workaround for this, in the form of cmake generator expressions.

Why they work I don't know. Black magic I say.

But if you add the generator expression somewhere to the relevant paths, it will allow you to modify the true output location.

For example:

set_target_properties(fering PROPERTIES
  # Output Direcorties
  # The '$<0:>' is a cmake generator expression to prevent multi-target build
  # generators like Microsoft Visual Studio from manipulating output directories
  RUNTIME_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/bin$<0:>
  ARCHIVE_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/artifacts$<0:>
  COMPILE_PDB_OUTPUT_DIRECTORY ${CMAKE_BINARY_DIR}/debug$<0:>)

Source c++ - In CMake, how do I work around the Debug and Release directories Visual…

Parallel Builds

Not all build systems parallelize the build out of the box, however, with a little bit of code you can make more build systems parallelize the build by default.

include(ProcessorCount)
ProcessorCount(CPU_CORES)
if(NOT CPU_CORES EQUAL 0)
  # Note: 'cmake --build' reads CMAKE_BUILD_PARALLEL_LEVEL from the
  # environment, so setting it here only affects tooling that forwards it
  set(CMAKE_BUILD_PARALLEL_LEVEL ${CPU_CORES})
endif()

ccache support

With a simple trick ccache can be enabled when it is found. The following source shows one method to detect and use ccache

find_program(CCACHE_FOUND ccache)
if(CCACHE_FOUND)
  set_property(GLOBAL PROPERTY RULE_LAUNCH_COMPILE ccache)
endif()

curl

curl is a useful command-line tool for interacting with the web. It is most commonly used for downloading files from the internet. If you wish to do this, and do not care to change the name, use the -O flag to use the remote's file name: curl -O <https://your-download.com>

Sometimes you want to download from a link that redirects in the browser. You can simply do this with the redirect flag -L: curl -L <https://reference-link.com>

find | GNU Findutils

GNU find is an incredibly useful tool for rapidly finding files by data like file name or type, using filesystem metadata and inodes. The syntax can look a bit finicky, but it's quite simple in reality.

Roughly, the syntax is as follows: find <path_to_search> <test_arg> <test_expression> As an example, find /etc -iname crontab is a fairly common command. It says to search in the directory /etc, test results with the 'case-insensitive filename' argument, and gives the filename 'crontab' as the expression for the file we want to find. If you want to search in the current directory you can provide the universal 'current directory' link ., i.e. find . -iname crontab
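The same invocation against a throwaway directory tree (the paths are made up), showing that -iname ignores case:

```shell
d=$(mktemp -d)
mkdir -p "$d/etc"
touch "$d/etc/Crontab"          # note the capital C
find "$d/etc" -iname crontab    # still matches Crontab
```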

qutebrowser

qutebrowser can optionally use 2 dependencies: pdfjs for viewing PDFs in the browser, python-adblock for adblocking

awesome window manager

There isn't an explicit way for removing a widget, but you can set it to nil with widget:set_widget(nil)

Make | GNU Make

It may be desirable to build in a separate directory. It is possible to use the common 'install' rule to hack around this: with the -e flag you can override variables using environment variables. However, this assumes there is a suitable variable that can be overridden.

imgui - Lightweight UI toolkit

GitHub - ocornut/imgui: Dear ImGui: Bloat-free Graphical User interface for C…

Tracy - Profiler

GitHub - wolfpld/tracy: C++ frame profiler

Valve Steam - Proton

A fork of various wine related projects to provide a robust gaming experience on linux

When running Proton manually it may complain about not having a compat data path. The path can be supplied with the environment variable STEAM_COMPAT_DATA_PATH. The variable expects a compatdata directory for the app being run, similar to a wine bottle. For example: STEAM_COMPAT_DATA_PATH="$HOME/.local/share/Steam/steamapps/compatdata/395230"

Pressure Vessel

Pressure Vessel is a simplified container, like flatpak, which allows game-specific runtime environments to make games easier to test for. You can enable verbose logging for pressure vessel with the environment variable PRESSURE_VESSEL_VERBOSE=1

Krita

Krita is an art and animation program that rivals photoshop in terms of pure art, whilst having a respectable community, endless customization, and being free and open source.

rclone

rclone is a very useful tool to move data between various cloud providers, or even custom ones

WARNING: If you do not use the --update flag you risk destroying newer files. Even then, it still carries the possibility of updating the wrong folder.

lsd

fd

lolcat

sl

nvtop

lutris

qemu

QXL falling back to low resolution through spice

Sometimes this can happen as a result of a regression in QEMU 4.1. As a workaround you can create a qxl device like the following:

-device qxl-vga,max_outputs=1...

ripcord

ripgrep

zsh

binwalk

rofi

Rofi is a dmenu-like list selector and viewer, primarily for Linux-based systems. It is a very fast and intuitive way of navigating the desktop.

Installation and Setup Notes

rofi requires a functional locale to be set, i.e. the LANG environment variable. If you do not want to set this globally on KDE Plasma, you have to create a file in $HOME/.config/plasma-workspace/env ending with .sh; this should contain commands which set the environment for the session.

Programming

Cache Line Optimization

A cache line is the smallest chunk of memory that can be fetched into the CPU cache.

If all important data sits in one single block, operations on that block can happen extremely quickly, since you aren't waiting on memory fetches. A good target to aim for is 64 bytes, the cache line size on modern x86_64 computers.

Concepts

Safety of Experimental Programs

In some cases, programming can be quite a scary thing, since there are very few lines of code between everything running smoothly and deleting huge amounts of hard work. Where loss of data is concerned, a proportional amount of effort should be put into ensuring no data is lost.

Testing - One common way of increasing the safety factor is careful testing, ideally automated tests.

Safety / Arming Mechanisms - In the most extreme examples of creating safe systems, the programmer might want to ask the user to manually enable dangerous or destructive operations, like with 'unsafe', 'force' or 'safety-off' switches. A real-world example of this would be an 'Arming Lever'; these are fairly standard practice with dangerous machines, like combat aircraft.

In the software world, this could take the form of a simple --unsafe flag, or similar.

A more manual solution is the common --dry-run (or -n) flag, which prints what the program thinks it should do, without doing anything real.

CPU Interrupts Timing Precision

Interrupts are handled on core 0 by default, so other cores should not receive them. Interrupts cause imprecise timing, since the interrupted code's execution is disturbed on top of other scheduled processes.

Thread Blocking

Thread blocking is supposed to pause execution when nothing is happening. In practice, in the majority of cases there is a trade-off: sleep()-style thread suspension saves resource usage and CPU saturation, but scheduler delays mean the wake-up timing is neither precise nor granular.

Unit Testing

Unit testing can actually be a much more effective specification than prose, since it is actively enforced by a regularly run program that tells the programmer exactly what's wrong with the program and what it should do.

Although in practice that can be quite difficult to create, and tiresome to maintain.

Signed Distance Fields

A distance field is a mapped set of distances from an object. A signed distance field offsets the distance, so you can detect whether something has passed a certain distance threshold by checking for a positive number.

Tools

ISPC - Intel Implicit SPMD Program Compiler

ispc is a compiler for a variant of the C programming language, with extensions for "single program, multiple data" (SPMD) programming. By leveraging the hardware extensions ispc exposes, you can improve the performance of a program by orders of magnitude, executing operations in parallel on the hardware, even on single threads.

C Language

Boolean values

There are some macros defined in stdbool.h to provide "boolean support".

Lua

todo function
local function TODO(...)
   local messages = {...}
   local message_string = ""

   for _, x_message in pairs(messages) do
      message_string = message_string..tostring(x_message).." "
   end
   -- 'debug_ignore' is a flag assumed to be defined elsewhere;
   -- when it is false or nil the assert fires and reports the TODO message
   assert(debug_ignore, "TODO: "..message_string)
end

Linux - tips and tricks

trash-cli is a program to send files to the recycling bin instead of unlinking them immediately

date - Get the current ISO time with your shell's date command

seconds: date +%Y-%m-%dT%H:%M:%S
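The UTC variant is often handy too; -u switches to UTC and a literal Z can be appended to mark it:

```shell
date +%Y-%m-%dT%H:%M:%S      # local time, e.g. 2021-09-25T14:53:00
date -u +%Y-%m-%dT%H:%M:%SZ  # UTC, with a Z suffix
```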

nc - inspect UNIX domain sockets with openbsd-netcat

This can be a useful learning tool as well as simple shell debugging.

Manually manage file locks with flock

This is a util-linux program that can manually manage file locks from the shell.
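A minimal sketch of the command form (the lock-file path is arbitrary):

```shell
# Run a command while holding an exclusive lock on the file
flock /tmp/demo.lock -c 'echo "holding the lock"'
# -n fails immediately instead of waiting if the lock is already taken
flock -n /tmp/demo.lock -c 'echo "acquired without waiting"'
```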

Increase Memory Page Limit for Processes

sudo sysctl -w vm.max_map_count=655300

Manually Load Filesystem from initramfs

exec switch_root /mnt/root /sbin/init

Aggressive Copy Operation Bottlenecks

It appears there may be some bottlenecks with Linux in specific circumstances, particularly page cache thrashing during heavy read operations that intend to discard files but actually leave them stuck in the page cache, locking up resources that could be used for buffering copies instead.

Side note… I don't get how this is the case, I thought this was the point of buffers? Memory that is allocated entirely to allow copy operations to move through memory? Why is it using the page cache? And why aren't programs telling Linux it really doesn't need to be caching this information? [2021-01-31 Sun 21:40] So it turns out the buffer cache, in fact, has nothing to do with buffers of any kind, and rather is an in-between cache that is read before accessing block devices.

mkinitcpio fallback initramfs | archlinux

The fallback initramfs should probably be generated separately, to prevent a bad config from ruining both the primary and fallback initramfs

cal is a command that shows you a calendar

ix is a program that lets you create a pastebin quickly

GRUB

The root is an important place: any files you try to access will be accessed relative to the current root. GRUB offers the search command as part of its utilities and rescue shell. This is used to find and set GRUB's current root, an important step in finding the correct boot files, using a number of optional identifiers related to your disk.

maintain a list of installed files just in case something goes wrong with the system

Caveats

Artix Linux

Mirrorlists are messed up; sometimes it's difficult or impossible to get new packages after sitting on updates for a long time

Shell | Info, Tips, Tricks

Retrieve history with the history command

Most people will learn pretty quickly that you can use key binds like C-r to drag up a recent command from the history. But actually, for bash, there is also a dedicated built-in shell command with more arguments: simply history. For example, you can clear the history in its entirety with history -c, show only the last 10 entries with history 10, or append this session's new entries to the history file with history -a.

zsh's history is a wrapper around the fc built-in, so the options differ, but you can still at least show the history with history

Specify a user with '~' expansions

Most users of unix-style shells know that '~/' will expand to the home of the current user. But actually, you can specify any user this way, and it will look up that user's home, if they exist, i.e. ls ~root/tmp will visit /root/tmp
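For instance (the output depends on the passwd database, so it is only typical):

```shell
# '~user' expands to that user's home directory from the passwd database
echo ~root            # typically prints /root
# ls ~root/tmp        # would list /root/tmp, permissions permitting
```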

binwalk can be used as a binary diff

Using the `-W` switch, you can obtain a hexdump diff of two binary files: binwalk -W file1 file2

rainbow is a really cool project to easily color output based on regex

A simple colouring would be something as follows ls -la | rainbow --cyan "zsh" --reset-before "\n" This would run ls -la and colour lines matching "zsh" cyan

disown is a command to detach a process from the controlling terminal
Global Environment Variables

The best place to set environment variables globally is in `/etc/profile`, though it is used exclusively by login shells. `/etc/environment` is a good secondary location; it is read by PAM, so it is a shell-agnostic way of setting environment variables.

Shell Specific

`fc` is a shell-specific command to manipulate the shell history, e.g. fc -l lists recent entries, and fc <n> opens entry <n> in an editor and re-executes it.

Wine

Useful Environment Variables

DXVK_CONFIG_FILE=/data/apps/titanfall-2/dxvk.conf
DXVK_HUD=compiler,fps,version
DXVK_LOG_LEVEL=warn
DXVK_STATE_CACHE_PATH=/data/apps/titanfall-2
WINEFSYNC=1
__GL_SHADER_DISK_CACHE=1
__GL_SHADER_DISK_CACHE_PATH=/data/apps/titanfall-2/nv-shaders
__GL_SHADER_DISK_CACHE_SKIP_CLEANUP=1

Config files

Config files for KDE apps are, generally speaking, 100% pure text. Dialogs and popups can be resized based on desktop resolution, although their placement is heavily fragmented.

ZFS

It's imperative that you set a max quota for storage snapshots, otherwise they will quickly overwhelm the system, and it is a nightmare to revert back

Windows

Searching for a File by Name

Windows File Explorer search is pretty atrocious; thankfully there is a respectable command line tool built into Windows / cmd.

You can use the following command to find a file: simply navigate to the directory you want with cd, then use the dir command. dir /S <search_query> Searches are usually done with wildcards, where * means "any characters match here". dir /S screenshot*.bmp

This would match a file like, screenshot_2020_6_14.bmp, but not my_screenshot.png

Powershell has a somewhat esoteric syntax for making a symlink, and the original mklink is a cmd built-in command. So you either have to defer to command prompt, or use PowerShell's included New-Item, on Windows at least.

The syntax is as follows New-Item -Path <link_path> -ItemType SymbolicLink -Value <original_path>

The Path (cmd)

You can temporarily set the path using the path command. The syntax for it is simply path <C:\some\new\path>

You should ignore the temptation to quote the path in strings; cmd doesn't care about the notion of a string (especially for the path) and will just insert the quotes literally.

Most often you want to append to the path, for that, get the value of the path, separate the value with a semicolon (;), and then append your value. path %PATH%;<C:\appended\new\path>

The Registry

The Windows Registry is a giant database of keys and values, and it is used extensively by the Windows kernel and operating system to store long-term settings and record small notes about changes to the system, like an app being installed.

You can interact with the registry in two main ways.

The first way is the GUI from the executable regedit.exe found in C:\Windows.

The second way is through the command prompt (cmd.exe) with the command reg. You can retrieve a list of commands available to you using reg /?, although the help strings are rather obtuse.

Also, rather confusingly, "keys" and "value names" are distinct things, and it's completely possible to have a value name MyValue under a key also called System\MyValue. And System\MyValue can actually have another subkey by the same name, System\MyValue\MyValue.

This is very much worth bearing in mind, since the syntax mimics path names, but it is clearly a little different.

For the most common operations, they generally follow reg <verb> <key_location> <option_list> For example, to query the values of a key, reg query HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search\

To add a basic DWORD32 value, reg add HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search\ /v BingSearchEnabled /t REG_DWORD /d 0x0
/v - specifies the "value name" to be written to
/t - specifies the "type" of the value, usually a 32-bit DWORD
/d - specifies the "data", or value to be written
/f - optionally forces the action and bypasses confirmation prompts

To delete a value, reg delete HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search\ /v BingSearchEnabled
/v - specify the value name to be deleted
/va - optionally delete all values under the key

Registry keys to change install

reg add HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Search\ /v BingSearchEnabled /t REG_DWORD /d 0x0

Environment Variables

Windows has a few lesser-known environment variables that can be read from and written to; these are in essence shell variables, using the %<var>% syntax to read from. They work in Windows Explorer and shells. Here are some of the more useful environment variables

Partitioning

For various reasons, it is advisable to separate a dedicated (small) Windows partition from the rest of the system. To open up the Partition Manager, search for diskmgmt.msc in Start or the run prompt with Win+R, or launch it from the Windows Command Prompt cmd or powershell

Alternatively, you can use the diskpart utility in cmd or powershell

As for the reasons for separately partitioning Windows: many are due to Windows quirks. Here are a few.

By default, apps install to the same drive as all their system files; whilst this is fine for personal files, it's not fine when you have literally hundreds of gigabytes of apps

And shortcuts like 'Documents' won't allow you to go "back up" a directory easily, back to the user folder. Having most of the files you care about on a separate, top-level drive like 'D:' can make life easier.

Hibernation install

By default Windows enters hibernation on shutdown. This causes all manner of issues: it leaves the filesystem in an unstable, uncertain state, where it could corrupt the filesystem if you try to boot Windows after modifying it offline; it causes annoyances like polluting memory and the desktop with processes and applications left over from the last session; and it leaves a large, hidden hibernation file on disk when you really need the space.

The idea behind hibernation is to improve boot speed and get you into the workflow faster. However, with modern computers, and SSDs, this often isn't the case.

You can disable it from the command line… And only from the command line, with the following command. powercfg.exe /HIBERNATE off or powercfg /H off

Alternatively, if you can't, or don't want to, permanently disable hibernation, you can simply run a command to shut down ignoring hibernation shutdown /s /f /t 0 The flags have the following meaning
/S - Shutdown the computer
/F - Force applications to quit without warning the user (warning: may destroy unsaved data)
/T 0 - Time-out period, set to 0 for instant shutdown
Bonus
/C <comment> - Add an explanatory comment for the shutdown, replace <comment>
/E - Document the reason for an unexpected shutdown

Service Management

sc is the commandline alternative to managing Windows services

Using this you can tell Windows to restart the service within 1 second or less on failure, e.g. sc failure MyService reset= 0 actions= restart/1000 (note the mandatory space after each =; MyService is a placeholder)

Automatic Logon install

Run netplwiz from the run prompt (Win+R) to disable password required login

Disable Automatic Recovery install

Run an elevated command prompt and run this command bcdedit /set {default} recoveryenabled No

Hyper-V preventing VM from booting install

Disable nested virtualisation by changing module parameters modprobe kvm_intel nested=0

Services to disable install

Alljoyn Router Service
BitLocker Drive Encryption Service
Connected User Experiences and Telemetry
Diagnostic Policy Service
Downloaded Maps Manager
File History service
GameDVR and Broadcast Service
Geolocation Service
Internet Connection Sharing
Netlogon
Parental Controls
Retail Demo Service
Windows Biometric Service
Windows Connect Now
Windows Defender Antivirus Network Inspector Service
Windows Defender Antivirus Service
Windows Defender Firewall
Windows FAX and Scan

If you don't use windows backup

QEMU

Intel GVT-g

Requirement | Memlock

For Intel GVT-g to work properly it needs to have a sufficiently high memlock limit. This can be set in the /etc/security/limits.conf file, for example

*                soft    memlock        unlimited       # Allow applications to lock more memory
*                hard    memlock        unlimited       # Allow applications to lock more memory

This sets the RLIMIT_MEMLOCK, or memlock limit, to unlimited

Passthrough

The gtk-based display appears to not work at all with Arch's official QEMU build. Be careful with turning off the qxl adapter, as the state may be difficult to recover. dma-buf appears to not work properly with Arch's official QEMU build.

Hyper-V Enlightenments

Windows has Hyper-V hypervisor cooperation which can improve performance somewhat. Unfortunately, the hv_vapic flag is actually slower than using a relatively modern CPU with native virtualisation support (starting with Ivy Bridge-E and Haswell Intel CPUs). So it should be avoided.

Game Development

Blender

Show polygon count

Viewport dropdown in the top right of the viewport > Statistics It should now show in the top left

Dotnet

Cleaning up properly with dotnet clean

For some reason the dotnet clean command doesn't really "clean" by default; at best it just prunes some of the binaries and artifacts. This is not ideal, especially when dealing with situations where files are mass committed; you most definitely do not want build artifacts cluttering your source control.

Here is an example of a trick I found online, which you add to a .csproj file inside the <Project> block, that helps to alleviate this problem.

<Target Name="WroughtDemo" AfterTargets="Clean">
  <!-- Remove obj folder -->
  <RemoveDir Directories="$(BaseIntermediateOutputPath)" />
  <!-- Remove bin folder -->
  <RemoveDir Directories="$(BaseOutputPath)" />
</Target>

Credit: The Ultimate Guide to Getting Started with MonoGame and Visual Studio Code | by FloatingSunfish

The Target Name in this extract is arbitrary (here, the name of the project), and AfterTargets="Clean" acts as a hook of sorts that runs the target at a specified time during a build or similar command. This is kind of a problem because it implies there is no normal way to handle the cleaning up of things properly, so the command is essentially useless on its own, yet it exists, and is tempting to use.

This is essentially no better than rolling your own solutions with scripts. But this does cover the case of dealing with innocently using the dotnet clean command nicely.

Creating a new project from a template

Here is an example dotnet new console -o MyProject the -o argument specifies the output directory, which also names the project

Installing a NuGet package globally

You can install packages "globally" with a command; this doesn't require system access and only applies to the current user. dotnet tool install --global dotnet-mgcb

Restoring dependencies

Although many Dotnet CLI commands implicitly run dotnet restore, sometimes it breaks and must be run manually; if this happens you can run dotnet restore This installs the NuGet packages "globally" to the user's package cache, not as a part of the project.

MonoGame
  1. Creating a MonoGame Template Project

    The following will create a cross-platform OpenGL based MonoGame project dotnet new mgdesktopgl For this to work you must have the template installed

  2. Resources

    MonoGame Content Builder

  3. Content Pipeline

    The MonoGame Pipeline is a system which exists to optimize and manipulate game assets, particularly textures and sprites. The MonoGame Content Builder (MGCB) is a tool that converts assets into MonoGame assets, this is a NuGet package that can be installed globally or as a part of a dotnet project. Here is an example of a command to install the toolchain globally using NuGet dotnet tool install --global dotnet-mgcb

    Absolutely all of the content generation information is kept track of and stored in the Content.mgcb text file usually stored in the Content folder of the project.

Wizarding Tricks

Quake's Fast Inverse Square Root

The Quake fast inverse square root is a bit-level initial guess refined by a modified Newton-Raphson iteration, an approximation method that refines the result as the iteration repeats.
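
The trick can be sketched in Python (emulating the 32-bit float bit reinterpretation with struct; the constant 0x5f3759df is the commonly published one, and the function name is illustrative, not Quake's original C):

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Approximate 1/sqrt(x) using the Quake-style bit trick."""
    # Reinterpret the 32-bit float's bits as an unsigned integer.
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The famous magic constant yields a good initial guess.
    i = 0x5f3759df - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson iteration refines the guess.
    y = y * (1.5 - 0.5 * x * y * y)
    return y

print(fast_inv_sqrt(4.0))  # close to 0.5
```

A single iteration gets within roughly 0.2% of the true value; adding a second iteration line tightens it further.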

Unreal Engine

Labels | Actor Names | Unreal 4.22

It is possible to programmatically set the "label" of an actor, the label is the unique name of the actor, for editor purposes only.

To set the label, you have to convert the blueprint to an EditorUtilityActor; this can be done in the toolbar of the blueprint editor. Blueprint Editor -> Toolbar -> Class Settings -> Details Panel -> Parent Class -> Editor Utility Actor (https://i.imgur.com/Ru1M2nM.png)

Persistent Level Actor References

It's natural that at some point you want to programmatically interact with your level. However, a common issue, particularly with level blueprints, is understanding the flow of data in Unreal, and knowing how to access what you want. Fortunately it is quite trivial to get a level actor reference.

Rather unfortunately the way of doing so is quite surprising.

You have two main ways

World, Scene, Actors.

An actor is a type of UObject that has the facility to be placed in the world, the physical space in the game.

A USceneComponent is a movable object in a world; if attached as the RootComponent of an Actor, it gives the Actor a transform so it can be moved. A custom class may not create its own RootComponent by default; to do so, write something like the following in the constructor of your class. RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Origin")); Here the TEXT macro is used to set the Outliner name for the object.

Hot Reloads

Hot reloads can be a very useful tool for fast iteration without restarting the editor or game. Simply run HotReload <YourModuleName> in the console, case insensitive.

You should be very careful: it is not magic, is prone to crashes, and can cause repeated crashes when hot reload artifacts are left around between restarts.

A general build will clean up HotReload runtime libraries.

Console Commands

Unreal Engine has a very robust command console that has many built in variables and executable functions, however, many of the commands have limited documentation, or just straight up don't show in the autocompletion box at all.

Case Insensitive It's also worth noting all commands are case insensitive.

There is a 3rd party website that lists as many commands as possible here UE4 Console Variables and Commands

Brackets <> signify a user-substituted value.

number is "on", very useful for performance/simulation debugging

Unreal Build Tool (<>)

The Unreal Build Tool is a cross-platform build system and pre-processor that provides both robust project builds and powerful pre-processor-time macros that make working with the complex mechanics of Unreal Engine more accessible. It can be extremely powerful and allows for a very robust build system; however, this has one major downside: it absolutely destroys any ability to get sane linting and static analysis for code. Since much of the code is introduced at preprocessing time, almost no code-aware tools can reliably work with the actual code that is generated. For this reason it is wise not to rely on syntax checking and static analysis when using Unreal; the compiler is functional, so work on getting iteration and build times down instead of waiting on static analysis.

UBT is reliant on the dotnet environment since it is written in C#. This means the respective environments should be installed on the machine; usually this means dotnet for Windows and mono for Linux. The UBT is a very versatile tool that has numerous options.

The basics of the UBT are the target.cs and build.cs files. Both contain class definitions and a constructor that must include the name of the file it's defined in, without the file extension. They also both have appropriate settings to customize the build for one module. The Target: this file specifies what module, and what kind of module, you would like to build.

The Build: the build file specifies how a module should be built, with all the various build settings and compiler flags.

When invoking make, you should avoid the temptation to set parallel build flags; the UBT will already parallelize everything, and setting special flags will just try to spawn extra instances of MSBuild and the UnrealBuildTool.

  1. External and alternative tools

    The UBT has the ability to generate cmake files, makefiles and a compilation database. For cmake add the `-cmakefile` parameter. For makefiles add the `-Makefiles` parameter

Rendering
  1. Graphics Pipeline

    The game thread and rendering thread are separated for simplicity's sake. The rendering thread usually runs a frame or two behind the game thread. Inter-thread communication is usually handled through the `ENQUEUE_UNIQUE_RENDER_COMMAND_XXX_PARAMETER` family of macros. The macro creates a local class with a virtual `Execute` function that contains the code entered into the macro; the game thread inserts the command into the rendering command queue. The `FRenderCommandFence` provides a convenient way to track progress of the rendering thread. The `FRenderResource` provides the base rendering resource interface and provides hooks for memory management. `FRenderResource::InitResource` can only be called from the rendering thread. `BeginInitResource` is a helper function that can be called on the game thread to enqueue the above rendering command.

  2. Materials

Garbage Collection

Garbage Collection (or GC) happens on the game thread and operates on `UObject`s. The game thread may delete a `UObject` while the rendering thread is operating on it, so the rendering thread should never dereference a `UObject` pointer without checking for garbage collection. `DetachFence` is an example of preventing GC from deleting a UObject.

Particle Sim Research

For representing in-world particles, the objects being represented do not have any strict requirement to use or inherit from UE4 classes. All that matters is that the desired interfaces are used and the resulting object is rendered properly.

  1. Research Links

    Intel Fluid Simulation for Video Games

  2. Collision

    This could be handled internally with calls to the GameplayFramework collision system. Could also look at implementing the system into the GameplayFramework collision system without making every sub-component of the system an actor. Per-particle collision would be necessary for accurate simulation. Performance optimization should look to minimize inter-particle collisions. There is the Newtonian and

  3. Techniques

Graphics Development

Shaders

Shaders are the name given to programs that run on the GPU, the hardware acceleration method enabled by modern GPUs. They speed up execution by taking advantage of aggressive parallelism.

Each GPU has its own architecture and drivers, so standard x86 instructions are not enough.

Graphics APIs were devised to enable cross-platform, cross-vendor interfaces for graphics development. The likes of OpenGL, D3D and Vulkan are modern examples of this.

Each graphics API comes with its own shader specification that defines a medium the graphics driver can consume to perform hardware accelerated tasks; GLSL and SPIR-V are examples of this. HLSL and GLSL are the primary shader language formats, written for DirectX and OpenGL respectively. However, there are a lot of cross-compilation options for all shader types, including compilation to SPIR-V, a byte-code shader format designed for the Vulkan API.

Space Engineers

Programmable Block Control Input on Static Grid

Control input isn't read properly on static grids, attach any control system to a subgrid of some kind

Whips Artificial Horizon Script

Fighter Replica HUD Theme

Artificial Horizon - Colors
Sky background=10, 20, 30, 255
Ground background=10, 10, 10, 255
Space background=0, 0, 0, 255
Prograde velocity=150, 150, 0, 255
Retrograde velocity=150, 0, 0, 255
Text=0, 229, 39, 255
Text box outline=0, 229, 39, 255
Text box background=10, 10, 10, 150
Horizon line=0, 229, 39, 255
Elevation lines=0, 229, 39, 255
Orientation indicator=0, 229, 39, 255
Space x-axis=100, 50, 0, 150
Space y-axis=0, 100, 0, 150
Space z-axis=0, 50, 100, 150

Remarks

Dictionary

<<<Parametric Equations>>> maths

In maths, a graphical equation where a variable, for example t, is present in both the x and y expressions.
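
For example, a circle can be written parametrically, with t appearing in both the x and y expressions; a quick Python sketch (the function name is illustrative):

```python
import math

def circle_point(t: float, r: float = 1.0):
    """Parametric circle: both x and y are expressions in t."""
    return (r * math.cos(t), r * math.sin(t))

# A quarter turn round the unit circle lands at (0, 1).
x, y = circle_point(math.pi / 2)
print(round(x, 6), round(y, 6))
```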

<<<Adjoint of a Matrix>>> :maths

The transpose of the cofactor matrix (also called the adjugate); informally sometimes used to mean the transpose of a matrix.

<> programming

A situation in which an object is read or assigned as its base class, causing some members to be sliced off, effectively leaving quirky and unexpected behavior when trying to read from or make a call to the object.

<<<Race Condition>>> :programming

A situation where inconsistent order of execution influences the outcome of a given program.
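
A minimal sketch of the classic lost-update race, with the unlucky interleaving written out by hand instead of left to a real thread scheduler (variable names are illustrative):

```python
# Two "threads" both intend to perform counter = counter + 1,
# but their read-modify-write steps interleave badly.
counter = 0

a_read = counter        # thread A reads 0
b_read = counter        # thread B reads 0, before A writes back
counter = a_read + 1    # A writes 1
counter = b_read + 1    # B also writes 1, clobbering A's update

print(counter)  # 1, not the expected 2: one increment was lost
```

With a lock around the read-modify-write, the result would always be 2 regardless of ordering.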

<> programming

An executed process that accelerates a task by performing it in parallel, taking advantage of many threads as opposed to fast single-threaded execution
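
As a rough sketch of the idea, Python's standard thread pool can fan a task out across workers (the square function is just a stand-in for real per-item work):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n: int) -> int:
    # Stand-in for some expensive per-item computation.
    return n * n

# Fan the work out across a pool of worker threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```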

<> | High Level Shader Language programming

A shader language primarily designed for D3D.

<> programming

A common shader language primarily designed for OpenGL.

<> | Standard Portable Intermediate Representation - V programming

A portable intermediary bytecode/binary format (or language, as they call it) for shaders, designed for multiple Khronos APIs, like Vulkan, OpenCL, OpenGL. It is a successor to the SPIR format, which originally only supported OpenCL.

<> :programming

A modern graphics API that provides low level access to graphics hardware.

<> programming

An old, popular graphics API that is cross-platform and widely supported.

<> Application Programming Interface programming

Application Programming Interface, an intermediate piece of software that facilitates interacting with another piece of software, firmware or hardware through a common method of communication.

<> | Render Hardware Interface :programming graphics

A generic term for a graphics abstraction layer between software and a hardware graphics device, or more accurately, a graphics API.

<> | Pipeline State Object

Pipeline State Objects are kind of like object-backed settings for graphics hardware. A traditional setup will have the software and GPU constantly communicating data back and forth, one of those costs being recalculating and reporting on hardware settings. The PSO aims to partially solve this problem by changing the hardware settings only once per unique PSO. *needs expansion

Integer Overflow

<<<Integer Overflow>>> Integer overflow is when a value too large for a number to store is attempted to be assigned to a variable, the result is some of the leading bits will be truncated and result in the variable "wrapping round" to the 0x0 0x01 value and so on.