Ask HN: Small scripts, hacks and automations you are happy with?

2023-03-12 12:07:08

I built a script that finds appointments at the citizens’ office in Berlin. At first, it was to test a theory that you have to check early in the morning (turns out it’s false).

Finding an appointment is notoriously difficult, and people often depend on the results of those appointments to start their life here. So I built a UI for the script and put it online. Not just code on GitHub but a live website everyone can use.

It’s been live for almost a year now. Thousands of people got an appointment that way. I got the city’s blessing to keep it online, albeit with a restrictive poll rate.

It’s really cool to see people recommend the tool to each other, and to work directly with the city.

When I return from vacation, I want to ask the city to extend the tool, and also translate it to other languages.

Honestly, the script/hack/automation I’m still most proud of is from around 1983, when I was 12 years old, for my Commodore 64. Back then, all computer communications were over the phone line, and in my house, we only had one line, so I spent a lot of late nights, dialing into BBSs, often not going to sleep until the sun came up. Then, I would get in trouble at school for falling asleep in class. So I came up with a little program I called “Simon”. I named it after a popular kids game called “Simon Says”. It was basically a 1983 version of Selenium. I created instruction files, which would tell the main program (Simon) what to do: the time to dial a BBS, the number to dial, the login info, and then all the commands to send. It did things like download my messages so I could read them the next day. Send messages. But mostly, and kinda shamefully, it was used to pirate games. I had a list of about 10 local BBSs that I would dial every night, and a list of games I was looking for. If Simon found a game on the list, it downloaded the game for me. Then, it would hang up. I even created a hard-hang-up time of 7am, when my mom woke up, because if the computer was still on the line and she needed to use the phone, she would kill me! LOL

I’m still super proud of that because it was so much fun to create. I recall “releasing” the program to a handful of friends, and they all seemed to like it, but it never really went anywhere.

I come across quips and insights that I want to remind myself of; including them in a random message of the day as a ZSH login message is the right amount of “flair” for me. But this is too noisy if it’s a new random choice each time I open a shell. Using today’s date as a source of randomness gives us a random choice that’s the same all day long. So this is in my .zshrc:

    motd() {
      # Pick one line from a quotes file (the path is an assumption: one quip
      # per line), seeding the shuffle with today's date so the choice stays
      # the same all day.
      shuf --random-source=<(yes "$(date +%F)") ~/.quotes | head -n 1
    }

    motd

Not technically difficult, but I remember that it made me feel very proud: Back in high school (2004-ish) I made a small PHP script to generate a .m3u of my whole MP3 collection and I exposed this from my home PC. Since I could install Winamp on the school PCs this allowed me to stream my music collection.

Another one was a password rotation script for my university. There was an idiotic password policy that required you to change your password every 6 months via a web form, and if you tried to change it to a password you had used before you would get an error saying that the new password couldn’t match the last 24 passwords you had used. This information was very useful as it allowed me to write a script that simply changed my password 24 times and then changed it back to the one I wanted.
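
A hypothetical sketch of that trick, assuming a plain web form; the URL and field names here are made up:

    import requests

    URL = "https://uni.example/password/change"  # hypothetical endpoint

    def change(session, old, new):
        # Submit the change form; field names are assumptions.
        r = session.post(URL, data={"old": old, "new": new, "confirm": new})
        r.raise_for_status()

    s = requests.Session()
    current = target = "the-one-password-i-like"
    for i in range(24):
        throwaway = f"Temporary-{i}-Xyz!"  # placeholder passwords
        change(s, current, throwaway)      # burn through the 24-entry history
        current = throwaway
    change(s, current, target)             # back to the one you wanted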

Most useful script I’ve ever written, and legitimately how I keep track of everything important in my life:

  #! /bin/bash

  function todo {
   if [ $# -eq 0 ]
    then
     vim ~/Dropbox/todo/todo_list.txt
    else
     vim "$HOME/Dropbox/todo/todo_list_$1.txt"
   fi
  }

Sourcing this in my shell means I just have to type…

  todo

To automatically get into my default todo list, which is synced to Dropbox (and thus accessible everywhere, on mobile as well).

Meanwhile, whenever a new project/etc comes up, I can just

  todo new_project_name

And get a new, nice clean file, with the same backups and multi-device accessibility.

I use Org mode in Emacs, so I have a follow up:

Do you have a format in your text files to track when things are done, like a checklist?

In Org, you can give things states like “TODO”, “INPROGRESS”, “DONE”, “CANCELLED”, add checkboxes (like “[ ]” and “[x]”), have sub-tasks/lists, etc.

I have really gotten into Org, and I have found there are a few other checklist formats/programs in plain text. Just wondering how you do it.

So certainly not impressive, and morally ambiguous, but when ChatGPT was first released, I worked with it to create a Python script that crawled The New Yorker sitemaps, found articles with embedded audio versions, downloaded the MP3 by reconstructing the file URL using the article ID in the page source, and then renamed the file to match the page title and saved it to a well-structured Google Drive directory. Now when I walk my dog, exercise, or clean, I’ll put on a random article and have a much more enjoyable experience than a podcast would offer. The MP3s were publicly available, so I think it’s fine for personal use?

I’d never written a line of code in my life, so going through the iterative process of getting it functioning and fixing bugs was pretty invigorating. I could see where the generated code was going awry and describe it in natural language, and ChatGPT would turn that into syntax and explain why it worked. Definitely a fun way to learn.

I’m sure it’s ugly/inefficient, but here’s the script:

https://pastebin.com/raw/DuAVAikD

“going through the iterative process of getting it functioning and fixing bugs was pretty invigorating.”

Damn right it is! Writing scrapers is definitely the gateway drug of programming. It’s such an exciting feeling to step across the boundaries from “passive user at the mercy of the UI” to “I can do anything if I can conceptualise it (and break it up into steps)”.

You summed up what dragged me from “Sure I’ll learn to program a notepad or todo app so I can learn, but I won’t” to “I can do anything if I think about what I want first, then how to get it!”

+1. Writing scrapers and even API calls were the step for me that went from “I’m making my computer do something” to “I’m making my computer do something that’s interacting with the outside world on terms that I programmed it to. Whoa! I can do a lot of different variations of this!”

Pretty impressive to be able to create such a script when you haven’t programmed before.

I used to freelance when I learnt to code in college and I wrote many such small automation scripts for quite a few people. Looks like the AI is already taking away some jobs 😉

Easily my highest-utility script so far was to get an edge while picking classes. At my university there were some classes which were very competitive, and there was also a system where tranches of students would get access to those classes earlier (e.g. groups of honors students -> groups of seniors -> juniors, etc). I was a lowly sophomore at the time, so I basically got what was left over.

Before I could register I could see most classes I wanted to take had plenty of availability except for one which was almost full. That class would make my schedule perfect (no classes before 10am, everything ending before 4PM, good teachers), so I was ready the moment my group got access. I logged into the janky Java applet and saw that my desired class was full. Dismayed, but not giving up, I refreshed the class list to see if someone might drop during the drop-add week. Nope, still full, but there was still time for someone to change their mind and for me to slip in. I was going to need an edge.

Another peculiarity of this system was that it was definitely a weird Java app backed by some type of mainframe. It must have been around for a decade at least. One of the signs of its age was that you had a limited amount of time per day during which you could be logged into the system, something like 90 minutes a day.

So I did what any good hacker would: I made _really_ good use of that limited time. I wrote an AutoHotkey script which would automatically log in, traverse all the menus and attempt to sign me up for that class. It was much faster than a human, so each run barely put a dent in my logged-in time allotment. As a cherry on top, I wired it up to a python script to push notify me if it succeeded.

After letting it run for two days, some false alarms, and tweaking, I got a delightful “You’re registered!” notification on my phone and found that it had successfully gotten me the class I wanted. I’m still chasing that high to this day.

This is more in the “hack” category, but here is a solution for bookmarking directories[1] that I am quite fond of:

  mkdir ~/.marks/
  export CDPATH=.:~/.marks/
  function mark { ln -sr "$(pwd)" ~/.marks/"$1"; }

Then:

  mark @name   # add bookmark called @name for the current directory
  cd @name     # jump to the bookmarked location
  cd @<tab>    # list all available bookmarks

It can list bookmarks, auto-complete, jump to sub-directories within bookmarks, all without introducing any new commands – just `cd`.

[1]: http://karolis.koncevicius.lt/posts/fast_navigation_in_the_c…

Glad it is useful. Just a note of advice – prefix your bookmarks with a symbol that is not used to start directory names (was @ in my case).

This is useful for two things – 1) avoiding clashes with existing directories, 2) having a quick command to list all the bookmarks: cd @<tab>

I don’t have the script anymore but when I was an intern at a regional hospital system ~10 years ago they were running into problems with an old wireless system where after a certain global client count and AP uptime APs would stop registering clients or roams. At the start of the day everything was fine, by night some random wireless tickets were coming in, and by next morning everything was on fire and thousands of APs worth of Wi-Fi was useless. The wireless lead was sweating bullets trying to get the vendor to figure out what was wrong but the product was nearing the end of its life plus having transitioned between companies in its lifespan so support wasn’t great. As a stopgap he’d stay late and reboot all of the APs from the controller at night. It’d take about 30 minutes for everything to process and the doctors/nurses would still raise hell but a scheduled blip at night was better than everything being broken in the morning.

We had great AP density so by the 2nd night I started working on an Expect script that would log in and boot 1 AP at a time per floor-building combo (multiple areas simultaneously), wait for them to come back, wait about 30 seconds, and move to the next AP on each floor. At first I did a dry run where the reboot was commented out and the “is it back yet” was delayed by a short timer to simulate a reboot delay + make sure the script would really run without a login timeout or something stupid. This would take a couple hours but it was better than taking everything down, as clients under the specific AP being cycled would just roam and have a weaker signal for a minute (assuming the system had been booted the night before). We ran it the 3rd night in place of the mass reboot and watched it carefully. It went great so we left it to run every night. We changed the nightly notice from an outage to a potential outage and stopped getting complaints because, except for a particularly unlucky laptop in a corner for 1 minute, nothing was really out of the ordinary. At this point we knew we were buying a brand new wireless system that fiscal year (already in the budget) so we ended up just riding the year out with this script.

In all it was a lame reboot script shorter than the table containing the lists of APs to boot 1 by 1 but it probably had more real impact than any fancier scripts/tools I made while I was there.

Nothing particularly difficult or unique, but adding a simple script in crontab to download new videos from my youtube Favorites playlist has been immensely helpful in archiving over the years, as videos inevitably get blocked or deleted.

    yt-dlp --download-archive archive.txt --no-overwrites -o '%(playlist_index)s - (%(uploader)s) %(title)s.%(ext)s' --format bestvideo+bestaudio --yes-playlist --ignore-errors --write-info-json [PLAYLIST_URL]

Edit: if you use this, note that you should probably sort your playlist by Date added (oldest), otherwise the playlist_index doesn’t work correctly. Or you can just leave out playlist_index from the filename if you don’t care about it.

I use the following tiny script to make PDFs look like they have been scanned. There are still entities that require you to physically sign a document and scan it. So far, everyone accepted PDFs like these:

  #!/bin/sh
  ROTATION=$(shuf -n 1 -e '-' '')$(shuf -n 1 -e $(seq 0.05 .5))
  convert -density 150 $1 \
    -linear-stretch '1.5%x2%' \
    -rotate ${ROTATION} \
    -attenuate '0.01' \
    +noise Multiplicative \
    -colorspace 'gray' $2

I wrote a shitty shell + python script to get me music to DJ. It would:

* Load a playlist from Spotify (including fetching the API token programmatically from bash… shudder)
* Search youtube for the song + artist
* Download the MP3 of the song from youtube
* Set up the directory structure for Rekordbox DJ
* I could then manually import the song

When I was too broke to buy music for DJ’ing this satisfied my craving for new music to play.

I was teaching online, and it involved a significant amount of grading on the LMS (Learning Management System). It was a pain to click on the grade entry field for each student, then click “Next”. I wrote a browser extension to autoselect the proper input field, and then on Enter it would both submit the grade and load the next assignment. Saved me hours and hours while grading.

I created a callbox bot using Twilio about 10 years ago. It started as something simple: when my callbox bot is called, play dial tone 9 (to signal the callbox to open the door) and then send me a text message that someone is coming up. I wasn’t too worried about people abusing the system and this worked really well.
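
For anyone curious, a minimal sketch of what that first simple version could look like with Flask and the Twilio helper library; the phone numbers, route, and credential handling are placeholders, not the author’s code:

    from flask import Flask
    from twilio.rest import Client
    from twilio.twiml.voice_response import VoiceResponse

    app = Flask(__name__)
    client = Client()  # reads TWILIO_ACCOUNT_SID / TWILIO_AUTH_TOKEN from env

    @app.route("/callbox", methods=["POST"])
    def callbox():
        # Text me that someone is on the way up.
        client.messages.create(to="+15550001111", from_="+15550002222",
                               body="Someone is coming up!")
        # Send DTMF "9" back down the line to trigger the door.
        resp = VoiceResponse()
        resp.play(digits="9")
        return str(resp)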

Over the years, I’ve made it more complex. I added a simple menu system using text-to-speech and accepting the DTMF tones for the selection. This way I could tell people to enter a secret PIN and they could provision themselves access to my lobby. If they didn’t know the secret PIN, there was an option to have it patch through to my phone so I could screen the guest using normal voice and I could manually press 9 on my cell phone.

It also has an SMS interface where I can select different modes and hours of operations. So when I host a poker night, I’d txt my bot “poker” and it would it update the menu system to be Poker themed so that when guests were at the callbox they’d be greeted with poker puns. When my friend comes to visit from Austin, I’d txt “StarCraft” and he’d get to hear some dumb SC2 puns. I have 20+ different modes now for various occasions over the years. Now when people visit they get their phones out and record the greeting half-expecting something that they would want to remember or share on social media.

Sadly, my building just replaced their callbox with a newer model that does not accept secondary input. So once the call is connected to Twilio, if the user touches any button on the callbox (e.g. to enter a secret PIN), it will disconnect the call. I suppose it’s now time to use some voice-to-text options to bring back the interactivity, but I suspect the lag would make the experience more frustrating than fun.

I used to have a script that would turn my desktop into a “whiteboard for temporary files”. Every file older than 2 days would get copied to some temporary trash folder, and would then be deleted after 90 days of inactivity. I had no shortcuts on my desktop, only files and downloads. Everything there was meant to be temporary, quick sketches. Was super happy with that workflow and it was all a couple dozen lines of bash/powershell (as I migrated between OSes).

In 2004, the website All Music Guide did a horrible redesign. I was so irritated that I created a Firefox extension that did only one thing — it made All Music Guide a bit more useable.

To my knowledge, this was the first “site-specific” browser extension, so I wrote a blog post about it:

https://www.holovaty.com/writing/all-music-guide/

Aaron Boodman noticed my post and decided to generalize the idea; he created Greasemonkey, a single browser extension that would aggregate site-specific JS customizations. In the years since, this concept of “user scripts” has developed further, and I believe some browsers even support the idea natively.

In that case, thank you to you both for dramatically improving my web experience — I don’t know that I could stand browsing without uBlockOrigin and ViolentMonkey

I’m not aware of any browsers that support it natively, although TBH there are so many random Firefox forks that any one of them could have just bundled the extension and called it native

Ah, that reminds me of how I modified sites using https://www.proxomitron.info/ . It is a Windows program that starts a local HTTP proxy; you configure browsers to use this proxy, and it can block ads. This worked very well before the web turned HTTPS-crazy.

The way it blocked ads was to analyze the files it was fetching for the browser and modify found keywords; as a crude example, all IMG tags with SRC containing the string ad.* would get replaced by grey images.

I also remember using it to “fix” a redesign of a website.

There is a specific small hotel that will remain unnamed, which my partner wanted to book for a long long time. At any given moment if you check the reservations page, it is most likely booked for two years solid. It is famous enough that people book it all the time, but it has lenient cancellation policy of 100% refund if cancelled before 2 weeks. This is not really a problem for them as these vacancies get booked in a matter of hours (as I found out).

I wrote a script on top of Puppeteer that would be called from crontab on my home server, to go to the reservation page, press the buttons to flip through the calendar and extract days with vacancy (the reservation app is an SPA). When there were new vacancies it would send me a message through one of the messaging APIs about new dates available, so I could see if it aligns with our schedule and book immediately from the phone wherever I am.

That actually worked like a charm and I snatched a very convenient reservation after a few days.

I think it’s a reference to New Zealand’s Managed Isolation and Quarantine system during the pandemic. Demand was far higher than availability (and quarantine was mandatory to return home from overseas), so at some point someone wrote a script to book rather than just hitting refresh. Soon, everyone was using it and they were forced to move to a lottery system instead.

I made a script that checks CPU load of a remote machine and modulates a looping fan noise on my laptop to give that authentic heavy load experience when running remote builds 😀

oh I’ve been down the “I can just fix macOS” road. The sooner you move away the better. so much time wasted on plugins like these.

your effort will just be nullified the second Apple changes the accessibility APIs enabling Hammerspoon, like they did countless times before. I don’t even remember the names of the fallen apps before it.

instead I now contribute things as long-lived proper features to KDE and hopefully advance humanity instead of Apple’s profits.

This tiny script keeps my gmail inbox clean and manageable.

It auto-archives any emails older than 2 days (you can change this obviously) unless they’re starred – this changes my email experience in two ways:

1) My email inbox ONLY has emails that are either new or that I decided were relevant. I have two days to notice this.

2) When I’m scanning my email, if I see something important that I don’t have time to deal with, I can star it and keep moving and know that it’ll still be visible and in a manageable-size queue later.

When I travel off-grid it is a bit of a pain since I have to go through a lot of archives, but that can be addressed by just turning it off.

I also ruthlessly unsubscribe from things I’ve stopped reading to keep everything manageable.

https://github.com/AlexMakiJokela/Gmaid

One sillier one I look back fondly on was an undersea themed email signature generator I wrote maybe 20 years ago or so.

I think it was a Perl script connected to a named pipe or similar, so whenever my mutt client read it I got a new sig.

It generated 4 lines of an ASCII art underwater scene, containing a randomised selection of different fish, sharks, boats, seaweed, crabs, rocks, shells, treasure chests, etc.

There was a bunch of code to ensure no overlap, and more natural looking scenes.

There were also certain things that only triggered on certain special days of the year (Christmas etc.), and others that only appeared in conversations with people I’d exchanged more than N emails with.

Sadly it went away when my email client moved to be primarily web based.

In 1995, I thought that it would be good if advertisements could be served from hubs to large numbers of web sites. That would mean that advertisers wouldn’t have to hire ad space salespeople, which was far too expensive for most sites. But then there was the problem of how to decide which ads to show to which visitors to those sites. To do that, it seemed like it would be a good idea to base that decision on the sites people tend to visit, which seemed like it would maximize the probability that they’d actually be interested in the ads.

I looked for a way to do that. Browser cookies seemed like a possibility, until I learned that one site couldn’t read another site’s cookies. When I asked people if there was any workaround I was told that there wasn’t. But then I figured out how to use them to track people anyway. That hack is my answer to the OP.

I filed a patent in 1995 that showed how to do it. The patent tied that technique to my specific mathematical technique for selecting the ads the user would most likely be interested in, so it could not be used to restrict other people from using cookies that way as long as they didn’t also use my specific math. The patent also contained a lot of features about how to give users complete control over their information and that protected their privacy. (That part didn’t get implemented much!)

I eventually sold the patent to DoubleClick, and DoubleClick was eventually sold to Google for an amount I recall to be in excess of $1 billion. By then I’d seen a marketing site recommend that Google buy DoubleClick, with my patent listed as one of the reasons they should. DoubleClick had their own patent describing the tracking cookie, but their priority date was a couple months later than mine. And it didn’t have the privacy features mine did, or the same degree of tech about how to actually select ads.

In defending against a patent troll in 2021, Google and Twitter together filed a joint brief calling the tracking cookie “Robinson’s Cookie”. My name is Gary Robinson. (In case anyone is curious there’s more info at https://www.garyrobinson.net/2021/07/did-i-invent-browser-co….)

I ran an e-commerce business that used Australia Post for shipping. A few years ago, I was responsible for 90% of Australia Post’s lost parcel/SLA refund customer service volume in one day. Before COVID, Australia Post had a “Next Day Guarantee” for parcels sent through their Express Post network. If the parcels didn’t arrive the next day, customers could choose to get a prepaid satchel or their money back.

To take advantage of this, I wrote a script that tracked our shipments from the previous 12 months and identified which parcels were late. It then lodged a new inquiry for each late parcel through Australia Post’s website. I received a call from the head of Australia Post’s customer service team, telling me I was responsible for 90% of their volume. Apparently, what was causing delays was the need to wait for us to select “get money back” or “prepaid satchel”, and she asked if they could do one or the other for all inquiries. This would have impacted the queue size (guessing one of their KPIs).

In the end, we received several thousand dollars’ worth of postage refunds.

I’m not sure I understand. You submitted one year of late parcel requests in one go (presumably with the script), and then you had to manually select to ‘get money back’ or ‘prepaid satchel’?

On Windows:

– Created a folder called “shortcuts”

– Copy the folder URL into the system path (Control Panel, Environment variables something something)

– Create a shortcut of anything I want to hyperlink to and put in there, with descriptive names

– In IE, I think the setting was to enable autocomplete. For some reason that was linked to the autocomplete of the run menu item in Windows.

– For extra bonus points, add a shortcut of the shortcuts folder and put in the folder.

Result is that you can hit Win-R, type e.g. “sho” and it would autocomplete “shortcuts”, or “ti” for “timesheet” etc etc. Way faster than clicking around for icons, just a half-second until your oft-used app/spreadsheet/folder/website is loading.

Question: how can I do this on Mac?

This is a set of aliases/functions that I use to set up tmux and my ssh connections. I’m always in tmux, be it locally or in ssh; it survives session disconnections.

  # full terminal page of \n
  pgdown () {
    printf '\n%.0s' $(eval echo {1..$(( $(tput lines) - 1 ))})
  }
  
  # clear terminal and TMUX history and get to the bottom line
  c () {
    echo -en "\ec"; pgdown
    if [[ -n $TMUX_PANE ]]; then
      tmux clear-history -t $TMUX_PANE
    fi
  }
  
  # force MOTD display in a new ssh session
  alias sshmotd='ssh -o SetEnv=SSH_MOTD=1'
  
  if command -v tmux 1> /dev/null; then
    # open a new tmux session named default or attach to it
    tmuxa () {
      systemd-run -q --scope --user tmux new-session -A -s default "$@"
      exit
    }
    
    # open a new tmux session named ssh_tmux or attach to it
    alias tmuxs='tmux new-session -A -s ssh_tmux && exit'
    
    # run tmux if TMUX variable is null locally or in an ssh session
    if [[ "${TERM-}" != tmux* ]] && [[ -z "${TMUX}" ]] && [[ -z "${SSH_CONNECTION}" ]]; then
      tmuxa
    elif [[ -z "${TMUX}" ]] && [[ -n "${SSH_CONNECTION}" ]]; then
      tmuxs
    fi
  fi
  
  # run MOTD at login only in an ssh session
  if [[ -n "${TMUX}" ]] && [[ -n "${SSH_CONNECTION}" ]] && [[ "${SSH_MOTD-}" == 1 ]] && [[ -f "/etc/update-motd.d/00-motdfetch" ]]; then
    motd
  else
    pgdown
  fi

All of my favorite things are little AutoHotkey shortcuts. I have several that basically just execute a Regex replace on whatever text is currently highlighted. It’s ridiculously convenient to just have those functions ready to go any time.

But I started accumulating so many of them, it got to be a problem just remembering all the ones I had and how to trigger them. I had to start finding new ways of keeping them all organized. That led to figuring out two things that I’m still really proud of and use every day: custom right-click menus and Emacs-style keychords.

The custom right-click menu is: any time I right-click while also holding down Ctrl or Alt, it opens a right-click dialogue run by AutoHotkey, not the default one in the given context. That lets me visually select what I want to happen rather than needing an arcane key combo for every thing. All of my frequently-used Regex functions are right-click menus now.

The key chords are: if I trigger a “prep” hotkey first—my most-used one is “Alt+G” for “Go-to”—it sets up a keyboard hook that listens for the next key and then dispatches to a function based on that. So, “Alt+G H” would be really easy to remember in my head as “Go to Hacker news.” And it pops open a new tab in my browser. I use that one a lot for opening desktop applications—”Alt+O C” maps in my head to “Open vs Code.” Super easy to remember.

I’ve automated 80% of the most common navigation items at my job this way. My coworkers have seen me do it and stopped me to ask, “Wait, tell me how you did that.” It looks and feels like magic.

I’m approaching your threshold of “I started accumulating so many of them, it got to be a problem just remembering all the ones I had and how to trigger them”

Can you share a repo with your “custom right-click menus and Emacs-style keychords” to save me some time?

Love hacker news!

Another interesting trick is to have a capslock layer (or any other “useless” key you may have, but capslock is much easier to reach obviously) with something like:

    ; ahk v1 code
    CapsLock::
      KeyWait, CapsLock
      If (A_PriorKey="CapsLock")
        SetCapsLockState, % GetKeyState("CapsLock","T") ? "Off" : "On"
    Return
    #If, GetKeyState("CapsLock", "P") 
      u:: ; convert selection to code in markdown by pressing CapsLock+u
        SendRaw, ````
        SendInput {Left}
      Return
      ; ...
    #If

I also find myself with way too many obscure Mouse button 2, Ctrl, Alt, Capslock, F12, etc. incantations I never remember and end up opening the script to search them. I thought about adding pie menus based on context but never got to it. Would you mind sharing a minimal example of your right click menu? Sounds much simpler and accessible. The keyboard hook à la vscode chords also seems pretty useful, will definitely implement that

I did something similar but using AutoIt

I used the dialog box to create little ‘applications’ such as listing all the printers in the building (along with description) and allowing the staff to choose the printers they wanted to use (back around 2005 ish)

I had loads of these little scripts for staff to use – and some server side stuff (mainly windows) such as generating daily configs and collecting all the logs and looking for differences

I pretty much automated half my job, allowing me to do more interesting stuff

One of my servers is an older Supermicro X11, and I’ve flashed its BIOS to be able to boot from an NVMe drive in a PCIe adapter. While this normally works fine (I had another identical board it worked flawlessly on), this particular one for some reason doesn’t always see the drive. Since the server is off most of the time, but wakes up daily to ingest ZFS snapshots from the NAS, the ability to turn on and off reliably is important.

I didn’t feel like doing that deep of a dive to determine why it sometimes sees the NVMe drive, so instead I wrote a script [0] that uses IPMI to power cycle it until it can get an ssh connection. Originally I was sending a magic packet, but realized that was only a one-shot, so I switched to calling the IPMI executable via subprocess. No idea why I left the other stuff in there.

Anyway, this has reliably worked for well over a year at this point, dutifully restarting the server until it comes up. Some days it takes 1 attempt, some days it takes 5, but it always comes up.

[0]: https://gist.github.com/stephanGarland/93c02385e344d8b338aab…

I got annoyed with unlocking my ebike with an app every day, so I created a car key enclosure with a Bluetooth chip to unlock it. So much work! Creating a custom PCB, writing software for the nRF52 chip, learning about BLE, security levels, GATT. Deep sleep and ultra low power usage. I use it every day and am really happy with it.

Here are two scripts I now can’t imagine living without:

[1] is the reason I no longer need a taskbar/dock. It’s a sort of framework you can use to build a muscle-memory-friendly window switcher in Linux, similar to pinning windows to the taskbar in Windows — map Win+1 to terminal, Win+2 to VS Code, etc. If the app isn’t running then it launches it; else, it switches to it. (I wrote this because I couldn’t find a consistent and foolproof way to do this in Linux, and if you switch window managers you have to start from scratch.)

(For better ergonomics, if the keyboard I’m using doesn’t have a win/meta key on the right, I rebind right control to win/meta)

[2] is a timestamped log file for you, a human. Set up a cron job to launch it every hour to note what you were working on, or append lines from the terminal whenever you’re chewing on a hard problem. Then, use the data to build an accurate picture of what you worked on during the last week, or grep last quarter’s log to find out why you decided to use library A instead of library B. I have used this since 2018 and frequently go back to gain insight into the reasons for old decisions.

1: https://gist.github.com/cbd32/cbec9a32b32bd9e93b0d2696c71b5f…

2: https://gist.github.com/cbd32/f1ee2967ec0181b934639c30f4e68f…

In my student days I bought and sold a lot of stuff online on a marketplace website that I guess was like Craigslist. The default sorting was by “recently posted” and your ad would very quickly get bumped off the front page. You couldn’t post duplicate ads, so I made a Python scraper that would download all my ad text and photos, delete the ads, then recreate them, which meant I could stay on top of the page whenever I ran the script (which eventually I ended up running as a cron job, of course).

Dunno if I’m proud of it, but I sold a bunch of stuff, most of which I’d bought by having other scrapers on other sites alert me when ads meeting my criteria were posted.

I used to run a site for tutorial screencasts. My “video workflow” involved recording, adding an intro and outro, transcoding to multiple formats and finally uploading the videos to the Internet Archive.

I automated everything except the actual recording using a Makefile that’s over here https://gist.github.com/nibrahim/2466292. I believe I added some sox commands to clean up the audio too in a later version, but misplaced the file.

Another one I did was a small script to create ruling sheets for calligraphy. It is a tedious process by hand, and having a script create a PDF based on nib width is a great time saver https://github.com/nibrahim/Calligraphic-Rulings

I have started making bash scripts for converting various backup snapshots from one tool to another. So far I’ve covered:

– Borg to Restic
– Borg to Kopia

They work quite fine, though Restic and Kopia lack the ability to change the root directory of the backup, so if your script extracts to a unique temporary directory for each snapshot then you get some confusing output about no files being cached (Kopia), even though it technically deduplicates them quite well. (I suspect Borg doesn’t have that ability either.)

Haven’t published them yet but I think I will sometime this year. If somebody pings me I’m very likely to work on them a bit more and publish them though.
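
For reference, the general shape of such a conversion, as a rough sketch rather than the unpublished scripts; the CLI flags used here (`borg list --short`, `borg extract`, `restic backup`) are standard, but the repo paths are placeholders and restic’s password is assumed to be in the environment:

    import subprocess, tempfile

    BORG_REPO = "/backups/borg"      # placeholder
    RESTIC_REPO = "/backups/restic"  # placeholder; RESTIC_PASSWORD set in env

    # List the Borg archive names.
    archives = subprocess.run(
        ["borg", "list", "--short", BORG_REPO],
        check=True, capture_output=True, text=True,
    ).stdout.split()

    for name in archives:
        # Extracting each snapshot into a fresh temporary directory is exactly
        # what triggers the confusing "no files cached" output mentioned above,
        # but content-level deduplication still works.
        with tempfile.TemporaryDirectory() as tmp:
            subprocess.run(["borg", "extract", f"{BORG_REPO}::{name}"],
                           cwd=tmp, check=True)
            subprocess.run(["restic", "-r", RESTIC_REPO, "backup", "."],
                           cwd=tmp, check=True)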

Definitely Kopia. I was skeptical at first, it’s a relatively new tool and not much is known about it (at least on HN) but it’s uber-fast and compresses extremely well. Detection of duplicate blocks seems both more aggressive and faster compared to Borg and Restic.

It’s honestly a win on all fronts. It can sync to remote places — which I don’t care about as I have a local directory where it does its stuff and I then distribute that with `rclone` to several encrypted cloud accounts — BUT many people do care so to them it’s an even bigger win.

So yeah, I tried Borg and Restic and Kopia and Kopia wins by all accounts: size of backup directory and speed of operation at least.

(One thing I should mention is that Kopia is also very aggressive at eliminating duplicate snapshots and sometimes you might find yourself lacking snapshots for e.g. five days as it happened with me; however, upon very close inspection — think unpacking the two snapshots before and after that period, and checking diffs, it turned out that I lost nothing, everything I wanted was still there. But, it’s a disclaimer one way or another. Plus, its policy for retaining stuff is not like that of e.g. Borg. Make sure to read up a bit on it before using. Again, it never loses data but it has a different opinion on what is “the last 10 hourly snapshots” compared to Borg.)

Thanks a lot for sharing.

I have been very happy with restic. It has worked flawlessly. The only things missing from what I’d like to have are error correction and asymmetric encryption.

I might look into Kopia when I have time. I was a bit worried that it’s newer and might be less reliable.

On any system I’m working on I add a stupid simple script named `can` that moves items to the trash can. I can’t say how many times it has saved my bacon avoiding accidental permanent deletions over using `rm`.

One of the simplest pieces of code, to help a colleague from a non-IT department: an Excel VBA hash count to show duplicates. Their management had instructed them to use a copy/paste nested-formula trick which was exponentially slower; now it took 1 second. Turned her week of dread (thousands of rows) into 1 hour of simple work. 8 lines of code, 6 minutes.

If you add a .CMD file as the first entry in a .ZIP, with compression set to “none”, you can rename and run the .ZIP as a .CMD, because the PK header will be treated as a nonexistent command and execution will continue after that. With a little imagination and some PowerShell I’ve made that into a self-extracting archive. It’s a dirty hack, but I’ve had it working in production for years now. I want to see if I can make the batch file chimeric so it can run in a Linux shell as well. Maybe that’ll be my lazy Sunday afternoon?

Right now I’m just using the PowerShell part to move the cursor up and to the left and delete it. You can still see the PK$#@($!!!1 for a nanosecond, but that just gives it character in my opinion.
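
A sketch of how such an archive could be assembled, reconstructing the trick as described rather than the production version:

    import zipfile

    # Store the batch file uncompressed as the FIRST entry, so its text sits
    # readably near the top of the raw archive bytes.
    batch = (
        "@echo off\r\n"
        "echo Extracting myself...\r\n"
        "rem PowerShell extraction / cursor cleanup would go here.\r\n"
        "exit /b\r\n"  # stop before cmd reaches the binary data below
    )
    with zipfile.ZipFile("selfextract.zip", "w") as zf:
        zf.writestr(zipfile.ZipInfo("run.cmd"), batch,
                    compress_type=zipfile.ZIP_STORED)
        # The real payload, compressed; assumed to exist on disk.
        zf.write("payload.bin", compress_type=zipfile.ZIP_DEFLATED)
    # Rename selfextract.zip to selfextract.cmd: cmd.exe chokes on the line
    # containing the PK header, then keeps executing the stored batch text.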

Treats git directories recursively. I especially like the fact that I can use it together with git aliases; e.g. I use this at least once a (working) day, like `gr -b /path/to/customer/git/repos fap` (--fetch --all --prune).

  #!/usr/bin/env bash

  set -o errexit
  set -o nounset

  usage(){
    printf "[TRACE=...] %s [-b /path/to/clones/root] (%s/w) -- <git commands/options>n" "${0##*/}" "$HOME"
  }

  if [[ -n "${TRACE-}" ]]; then
    set -o xtrace
  fi

  trap 'set +o xtrace' ALRM HUP INT TERM EXIT

  : "${GIT_BASE:=$HOME}"
  while getopts b:h OPT; do
    case "$OPT" in
      b) GIT_BASE="$OPTARG";;
      h) usage; exit 0;;
      *) ;;
    esac
  done
  shift $(( OPTIND - 1 ))

  while read -r REPO; do
    [[ -d "$REPO"/.git ]] || continue
    [[ "$REPO" =~ .terraform/modules ]] && continue

    tput setaf 8
    printf "%s … %sn" "$REPO" "$( git -C "$REPO" config --local --get remote.origin.url )"
    tput sgr0
    if [[ $# -eq 0 ]]; then
      git -C "$REPO" status -sb
    else
      git -C "$REPO" "$@"
    fi
    printf "n"
  done < <( fd --one-file-system --type d . "$GIT_BASE" )

back in the covid period, i wrote a small set of scripts to automatically attend my university lectures on zoom:

https://github.com/WantGuns/auto_meet

it does the bare minimum of:

    - parsing calendar events (which have meet links)
    - opening zoom meets at a lecture's starting time
    - exiting zoom meets at a lecture's ending time
    - shutting down the machine when all the meetings are over

my sleep schedule was f-ed up, to say the least. these scripts helped me keep up my attendance 🙂

I don’t have it on me but I worked for a company that was manually generating Let’s Encrypt Certs (about 20-30 of them) and manually deploying them every 90 days.

I automated the renewal process using TF and integrated it as much as possible with other services. I then integrated it into a CI/CD pipeline. It took DAYS to do these renewals (ironically of course, $30 per year for a fixed certificate would have been cheaper than the people time but #startuplife). It became a 30 minute process. It only took me two days of interrupted time to build this solution and get it working.

Still going after two years so it’s already saved about 900% on the labour costs.

A small workflow that I come back to all the time is generating curl commands with dynamic values through SQL queries, to run something against our API. People don’t always think about that, so once you have it in your tool belt it’s a great trick you’ll reach for often.

Something like the following, then just copy it in a shell script, chmod +x it and execute it:

    select 'curl -XPOST localhost:9090/apis/some-service/' || p.technical_name || '/locale/' || l.full_locale

Back in the modem days, I tapped into the pins between my modem and computer’s serial port and put a speaker between RX and GND, with a switch of course so I could turn it on and off.

When on, it would make data noises at a reasonable volume indicating the current bandwidth usage. If a download got stalled, it was obvious even if I didn’t have that window in the foreground, or wasn’t even sitting at the computer.

Also, different kinds of data had different sounds and cadences, and I got used to many of them. It was fun to hear an email coming in before my sound card officially announced it.

Problem: My kids and wife would start the air conditioner at the lowest temperature (16 degrees Celsius) and forget to switch it off. At one point in time my electricity bill was 50% of my house rent.

Solution: I used Home Assistant and Frigate (a local NVR) to detect if there are humans present in the room; if no humans for 10 mins, switch off everything.
I actually use a combination of things to detect human presence, including a motion sensor, PC activity, and user sleeping hours (10 PM to 8 AM); there’s also a switch I use to turn on sleep mode if I’m taking an afternoon nap (the switch automatically turns off after 3 hours).
I also use a template to check whether the AC temperature is set below 22; if it is, I set it to 22 degrees Celsius.
The automation above saved me 15-20% on my electricity bill.

    # Sort list by most common terms
    alias sorn='sort | uniq -c | sort -n'
    # Most common IP addresses:
    $ cat access_log | awk '{ print $2 }' | sorn

    # Last 30 modified files.
    function new() { ls -lt "${1:-.}" | head -30; }  # listing flags reconstructed
    
    # I just downloaded something, where is it?
    $ new ~/Downloads

Maybe this already exists somewhere and I don’t know about it …

Anyway, when working on feature branches I wanted a quick way to switch to master, pull, switch back and finally rebase.

So I made a little script (https://gist.github.com/lelanthran/6c9bd1125f89e7621364878d1…) that pushes the current branch name onto a stack before switching to a new branch, and allows me to switch back to the topmost branch on the stack.

     xgit master # Switch to master from current <really long branchname>
     git pull
     xgit pop # Switches back to <really long branchname>
     git rebase master

I use this now very regularly; when I want to switch branches I use `xgit <branch>`, because then I can simply switch back using `xgit pop`.

I have a Python script, called t.py and a bash alias “t” that calls it. I use it for quick navigation in the terminal instead of “cd” and “pushd/popd”.

The basic format is: “t <shortcut>”, while just “t” prints out all the shortcuts and waits for user input.

So, “t 1” goes to parent directory, faster than “cd ..”
Then, “t 4” goes to “cd ../../../..”
And of course “t” will print out all the options.

“t a” goes to my main development git worktree, while “t b” goes to place I build from.

But the real star is the history. I embedded “`save_t_history`” into my bash prompt. That’s another script that saves the current directory to a history file, then removes duplicates so staying in the same directory doesn’t accumulate. Typing “t h” shows the history back to about 25 unique directories. Typing “t h1” goes to the last directory you were in. “t h5” goes five ago. I’ll be working in tmux in one pane, and then jump to another and type “t h0” to bring it to the same directory.

This is basically my bread and butter for getting around.

I would post the code, but a.) it’s on a work computer so technically property of my company and b.) it’s a mess so I’d recommend others just build their own version.

A hint to get started is that the “t” alias is an alias for “cd `python3 /opt/t.py`”
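
Building on that hint, here is a minimal sketch of the idea, not the author’s script; the shortcut table is illustrative:

    #!/usr/bin/env python3
    # Print a target directory on stdout; the shell wrapper cd's into it.
    import os, sys

    SHORTCUTS = {
        "1": "..",
        "4": "../../../..",
        "a": os.path.expanduser("~/dev/main"),   # hypothetical worktree
        "b": os.path.expanduser("~/dev/build"),  # hypothetical build dir
    }

    if len(sys.argv) > 1:
        choice = sys.argv[1]
    else:
        # The menu goes to stderr so stdout stays clean for the cd.
        for key, path in sorted(SHORTCUTS.items()):
            print(f"{key}\t{path}", file=sys.stderr)
        choice = input()
    print(SHORTCUTS.get(choice, "."))

One way to wire it up so arguments pass through is a shell function rather than a plain alias: `t() { cd "$(python3 /opt/t.py "$@")"; }`.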

Wrote a tweet-fitting one-liner that downloads every csv-like file from every government open data portal that’s public on Socrata. Gave me a fun reason to play with FIFO files to do makeshift recursion.

  mkfifo f;echo >f&xargs -E, -a<(cat f) -I{} bash -c "wget http://api.us.socrata.com/api/catalog/v1?scroll_id={} -O-|jq -r '.results[]|.resource.id+\",\"+.metadata.domain+\"/api/views/\"+.resource.id+\"/rows.csv\"'|tee <(cut -d, -f2|xargs wget -P'{}' --content-disposition)|sed q"|tee f

I run a clipboard logger + window title logger in the background. I am able to find stuff from 10 years back. Works across all apps. Want to remember something? Just copy it. It might not be secure practice. I regularly encrypt the rolled-over logs using 7z.
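
A minimal sketch of the clipboard half of that idea; the pyperclip package is an assumption, since the poster’s actual tools aren’t named:

    import time
    import pyperclip  # assumed; any clipboard library works

    LOG = "clipboard.log"
    last = None
    while True:
        text = pyperclip.paste()
        if text and text != last:
            # Append timestamp + repr so embedded newlines stay on one line.
            with open(LOG, "a", encoding="utf-8") as f:
                f.write(f"{time.strftime('%Y-%m-%d %H:%M:%S')}\t{text!r}\n")
            last = text
        time.sleep(1)  # poll once a second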

Would you mind sharing which tools you use or share the scripts? That sounds like a fantastic, no brainer way of saving stuff. I’m guessing there’s a lot of noise too, like when you’re editing code?

In terms of shell aliases and helper functions, I actually go out of my way to not allow myself any.

As a young person, I was on a trip to visit another ISP to learn how a VPN would work between our two companies. Their administrator, a wonderful person with 30 years of experience in the telco-turned-ISP world, had all these incredible aliases on her workstation for managing her fleet. But on the new vendor’s system, a fresh install of the same OS left her almost unable to perform.

My jobs since have all presented the same opportunity for the same experience, and I’ve tried to leave it alone. I spend maybe 2/3 of my time logged into someone else’s server, and I don’t doubt myself, even knowing full well it’s a personal preference above all.

I feel much the same way, and for much the same reason. I do, however, have a largish fleet of aliases that I’ve built up over the years.

The vast majority of them are simple short forms. For example, `tf` for `terraform` and `tfp` for `terraform plan`. My brain just naturally thinks of the full command when using the short form and, because of this, I’m not at a loss if I’m logged in somewhere without my aliases.

I wrote one where, when the tick mark and top-row number 1 are pressed together, it sends a single left mouse click.

Tick mark and top-row number 2 sends a double mouse click.

It helps reduce strain on my right hand mouse clicking finger.

I once hacked a daily retail data polling script to simply make it carry out retries on failed polls. It succeeded about 80% of the time on retry.

This saved me on avg about an hour a day of manually remoting into retail store systems to gather sales and stock data for the central system processing.

That was a good feeling, and was so simple.

I’m not sure this qualifies as small since I’ve been working on it since last summer and it has grown to over a few thousand lines of code, but it is definitely automation and something I’m quite proud of.

https://github.com/TorbFoundry/torb

“Torb is a tool for quickly setting up best-practice development infrastructure on Kubernetes, along with development stacks that have fairly sane defaults. Instead of taking a couple hours to get a project started and then a week to get your infrastructure right, do all of that in a couple minutes.”

My ambition is to fill the space Heroku left, and right now I can get something like a React app and a Flask API, or a full Elixir Phoenix project, started and running in less than 5 minutes on Kube, with a full infrastructure-as-code environment, reproducibility and more.

It’s definitely not meeting my ambition yet, but it’s definitely in a place where I think people can use it and get a lot of value from it. I’ve been testing it with some friends over a few months and have been dogfooding it on my other projects.

I just finished adding a file system watcher to it over the past couple of days, so you can iterate and changes will quickly be reflected onto your cluster. Next I’ll extract this out of the CLI tool into a library, build an API over it, and offer a hosted solution and a web app.

I’ve been meaning to share it with people, but I’ve been heads-down building for maybe a little longer than I should have been.

Power saving on lab setups when nobody is using them. They have a Google calendar for booking, so a daemon checks the calendar every hour and, if there is no booking, turns off the outlets (the power supplies are remotely controllable over telnet).

One of my favorite spa/hotels for a weekend retreat for my wife and me has absurd waitlists — it typically takes a few months to reserve 2 nights. However, they have a 7-day cancellation policy that leads to a U-shaped availability curve for rooms.

The hotel website itself is inordinately difficult to search for more than 1-2 nights (each day takes 6-8 clicks to search), but it’s all queryable through an undocumented API, so I wrote a small CLI tool in Crystal that can scan the next 60-90 days (configurable, of course) to see if there have been any sudden openings.

It does it all in parallel, so I can find out within about 10 seconds if there are any rooms available within a 30 (or longer) day range.

It’s already helped book one wedding anniversary trip and one special getaway for 2 friends. I don’t use it often, but it’s wonderful to have around.

Reminds me of a similar hack I did recently. I wanted to book tickets for a show in Vegas, and for whatever reason the “ticket bundle” which included the show plus extras was about $20 cheaper than just the show on its own.

The only problem was selecting the bundle would automatically choose the seat (on the far side which weren’t good), skipping the seat selection modal.

I realized that if you had a seat in your cart, it would reserve it for 15 minutes and the bundle would pick the next available seat. Since I wanted central seats, I wrote a simple Puppeteer script which selected the bundle in multiple browser instances, reserving all the unwanted seats until mine was finally available.

Mostly recording and replaying the network request logs through Chrome and comparing diffs as I clicked around, looking at headers and URL params generated from clicking to get a high level picture for how it all fit together.

They didn’t do anything fancy in terms of auth, etc, so the only thing that was challenging was guessing some additional parameters and formats for things like number of nights, etc., once I had the basic structure.

Once I had a basic search working, the rest of it was pretty straightforward. It works for other hotels that use this booking software as well, but I didn’t bother to go down that rabbit hole much further as I didn’t want to encourage adversarial techniques and I only need to use it a few times a year.

My guess is the developer tools. Maybe the web page makes you open a date dropdown for “check-in” > select check-in date, same for checkout, hit “search”, and then a pop-up would open saying “sorry, nothing available”, which you’d have to dismiss before repeating the process; but in the developer tools you might be able to see that there’s an XHR query which is just something like /searchAvailability?checkIn=20230313&checkOut=20230315&guests=2, and OP’s script could just modify the dates and hit this request URL.
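
In other words, something along these lines; the endpoint and response shape here are hypothetical:

    import datetime
    import requests

    BASE = "https://hotel.example/searchAvailability"  # hypothetical endpoint

    today = datetime.date.today()
    for offset in range(60):  # scan the next 60 days
        check_in = today + datetime.timedelta(days=offset)
        check_out = check_in + datetime.timedelta(days=2)
        r = requests.get(BASE, params={
            "checkIn": check_in.strftime("%Y%m%d"),
            "checkOut": check_out.strftime("%Y%m%d"),
            "guests": 2,
        })
        if r.ok and r.json().get("available"):  # response shape is assumed
            print(f"Opening: {check_in} -> {check_out}")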

I don’t remember the exact script, but I once had a shit temp job doing QA on standardized test software. Basically: here’s the answers, take the test and make sure it doesn’t fuck up somehow. I used AutoHotkey in a way that let me do 80% of my workload with only 4 keys of the keyboard (multiple choice test). It was basically just abcd, then a bunch of tabs and shift+tabs to navigate through the otherwise tedious UI. I considered bringing in a Guitar Hero controller and mapping them to that, but got fired for sitting on my desk before I had the chance.

Why it was even necessary to do this in the first place, rather than a script that also entered the correct answer, was presumably some weird thing in the contract with the state that administered the test.

I modified the multitouch code for Karabiner Elements to support a palm-activated action that can be used to add another layer of keys to my MacBook’s built-in keyboard.

In college we had a class registration system that was first come first serve and enrollment started at 6am or something. I created a small script in selenium to automate this so all I had to do was have my laptop running and at 6am it would sign into the portal and grab up my classes as fast as possible.

This ended up always making sure I got into the section I wanted.

When I worked at IBM circa ~2006 you’d get a written warning called a clean desk violation if you left your workstation unlocked.

I wrote a little daemon that’d l2ping my Nokia brick phone; if it didn’t get a response for 30 seconds it’d invoke xscreensaver. Saved me a lot of paperwork.
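
A rough sketch of that kind of daemon, assuming BlueZ’s l2ping and xscreensaver are installed; the MAC address and timings are placeholders:

    import subprocess, time

    PHONE = "AA:BB:CC:DD:EE:FF"  # placeholder Bluetooth MAC
    missed_since = None

    while True:
        ok = subprocess.run(
            ["l2ping", "-c", "1", "-t", "5", PHONE],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        ).returncode == 0
        if ok:
            missed_since = None
        elif missed_since is None:
            missed_since = time.time()
        elif time.time() - missed_since > 30:
            # Phone unreachable for 30 seconds: lock the screen.
            subprocess.run(["xscreensaver-command", "-lock"])
            missed_since = None
        time.sleep(5)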

I currently work at a Call of Duty studio. My favorite hacks ( not super high tech, but the ones that had the most impact for the least code, and the ones I feel I can talk about.. ):

* Put together a little box that polls various knobs on a USB MIDI device to mangle the traffic going across its two interfaces. Allows for real-time latency / jitter / packet loss testing https://twitter.com/ultrahax/status/1200902654882242562

* Studio LAN game browser didn’t work across subnet boundaries (the studio is a few class B’s). Wrote a little daemon that’d take game discovery packets from one subnet, mangle the src addr, and send it on its merry way. Everybody can see everybody’s games, happy producers.

I wrote a small python script for work that checks the time I have clocked in at work, checks how long I have to work that day and then recommends me the best time to leave to get the bus. The time is available via the time management website from work and the public transport API. It works surprisingly well and has saved me a little time every day.

> a bot to automatically answer apartment ads

Isn’t that spamming? The bot answers ads that you haven’t even seen and that you might reject based on description/photos. There could be other people genuinely interested in these ads who will have a harder time contacting the landlord.

I’ve done three worth mentioning.

One is GenFortune: https://github.com/EternityForest/GenFortune
Which is like fortune, but it generates the fortunes using a MadLibs-style algorithm from data files.

One is a candle flicker simulation algorithm. I didn’t make this demo; I wrote the original for the PIC, and they adapted a version of it for JS: http://blackdice.github.io/Candle/

At the time, cheap LED candles weren’t very good, but it seems many are a lot better now. I used a model assuming that the flickering was wind-driven: the flame always wants to rise towards its maximum at a certain rate, but depending on the current wind speed it can at any time get randomly “toppled” to a lower value, while the wind settles down to a baseline but can randomly jump up in a gust to a higher speed.
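
A loose Python sketch of that wind-driven model as described; this is an interpretation, not the original PIC code, and all constants are arbitrary:

    import random, time

    wind = 0.1   # current wind speed
    flame = 1.0  # brightness, 0..1

    while True:
        # Wind settles toward a baseline but can randomly gust upward.
        wind = max(0.05, wind * 0.95)
        if random.random() < 0.05:
            wind += random.uniform(0.2, 0.6)
        # The flame always wants to climb toward its maximum at a set rate...
        flame = min(1.0, flame + 0.05)
        # ...but the stronger the wind, the likelier it gets toppled lower.
        if random.random() < wind:
            flame = random.uniform(0.3, flame)
        print(f"brightness {flame:.2f}")  # drive a PWM/DMX channel here
        time.sleep(0.03)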

Of course, now we have multicolor multipixel flame algorithms that do way better, but this is fairly good on a single-pixel light.

I still use basically the same algorithm for DMX flame effects in Python, but I apply the effect less to the red channel so it gets redder when the light goes down, for a little added variety.

The other is this RNG. It doesn’t actually have a full period for its 32 bits of state, but it’s very fast on PIC10-type chips. I can’t think of any reason I would use it. I was like 15. But of everything I’ve ever made, this gets the most attention, and I’m not entirely sure why. It doesn’t even pass all statistical tests. I would probably use it over an LCG though.

```
uint8_t x, a, b, c;

uint8_t randRNG8() // Originally "unsigned char randomize()".
{
    x++;                      // x is incremented every round and isn't affected by any other variable
    a = (a ^ c ^ x);          // note the mix of addition and XOR,
    b = (b + a);              // using only a few instructions
    c = ((c + (b >> 1)) ^ a); // the right shift makes sure high-order bits from b can affect
    return (c);               // low-order bits of the other variables
}
```
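
If you want to poke at its output yourself, here is a direct Python port with explicit 8-bit wraparound, plus a rough uniformity check over a sample (the seed is arbitrary):

    from collections import Counter

    def rand_rng8(state):
        """Port of the C routine above, with explicit 8-bit wraparound."""
        x, a, b, c = state
        x = (x + 1) & 0xFF
        a = a ^ c ^ x            # XOR of 8-bit values stays within 8 bits
        b = (b + a) & 0xFF
        c = ((c + (b >> 1)) ^ a) & 0xFF
        return c, (x, a, b, c)

    counts = Counter()
    state = (0, 1, 2, 3)         # arbitrary seed
    for _ in range(1 << 20):
        value, state = rand_rng8(state)
        counts[value] += 1

    # A uniform 8-bit generator would hit each value ~4096 times here.
    print(min(counts.values()), max(counts.values()))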

I made a small tool to automatically handle git/npm stuff without worries. Think of it as an automated tiny local CI: you just run “happy” and can rest easy knowing that in a few seconds/minutes (depending on your build+tests+etc) the latest running version will be deployed to git/npm/etc:

    npm install happy -g

    $ happy

    // Basically does:
    - npm build (if package.json build available)
    - npm lint (if package.json lint available)
    - npm test (if package.json test available)
    - git add . && git commit -m $MESSAGE || "Saved on $DATE"
    - git pull
    - git push

    happy --now
    // Same but omit the build+lint+test steps, ideal for e.g. a documentation change

    happy --major | --minor | --patch | --publish 1.2.3
    // Same but use `np` (install separately) to bump the {publish} version and publish it in npm

It is built to maintain many small Git/npm projects, which fits my use case and needs perfectly. But it has some limitations, of course: it assumes small projects working on a single git branch, so if you have any kind of more advanced git setup it won’t work properly.

I write a ton of CSS and could not find a command-line stylesheet checker. I did find the Nu Html Checker from W3C, which also checks the file’s CSS: https://github.com/validator/validator/releases/tag/20.6.30

AFAIK it doesn’t have an option to check a standalone CSS file.

I wrote a script that inserts a CSS file into a dummy one-line HTML file, so that I can pass a CSS file to the script and any errors generated from the CSS are reported with the correct line number.

I’ve written many more complicated and maybe more interesting scripts, but this is the one I’m by far the most pleased with.
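
A minimal Python sketch of the same idea; filenames are assumptions, and since all of the HTML boilerplate sits on the first line, reported line numbers are off by exactly one:

    import pathlib
    import subprocess
    import sys
    import tempfile

    css = pathlib.Path(sys.argv[1]).read_text()
    # Keep every bit of HTML boilerplate on line 1, so CSS line N is reported as N+1.
    html = ("<!DOCTYPE html><html lang='en'><head><title>css</title><style>\n"
            + css +
            "\n</style></head><body></body></html>")

    with tempfile.NamedTemporaryFile("w", suffix=".html", delete=False) as f:
        f.write(html)
        tmp = f.name

    # vnu.jar is the Nu Html Checker release linked above; it reports to stderr.
    result = subprocess.run(["java", "-jar", "vnu.jar", tmp],
                            capture_output=True, text=True)
    print(result.stderr, end="")  # remember: subtract 1 from the line numbers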

A useful hack I’ve implemented in several projects is to accept the stored password hash as a valid password.

It is generally possible to impersonate a user but there are always subtle differences in the way those sessions work.

Accepting the hash lets people with database access perform a real login if needed.
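
A minimal sketch of that check, assuming bcrypt-hashed passwords (hmac.compare_digest keeps the fallback comparison constant-time):

    import hmac

    import bcrypt  # pip install bcrypt

    def check_login(supplied: str, stored_hash: bytes) -> bool:
        # Normal path: verify the password against the stored bcrypt hash.
        if bcrypt.checkpw(supplied.encode(), stored_hash):
            return True
        # Escape hatch: typing the stored hash itself also logs you in,
        # which lets someone with database access perform a real login.
        return hmac.compare_digest(supplied.encode(), stored_hash)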

> Accepting the hash lets people with database access perform a real login if needed.

… and if anyone managed to, say, do an SQL injection and retrieve said hashes, that would then give them access to your users’ accounts, right?

Script which parses my calendar and creates tasks in my GTD-like system in Todoist, so all my tasks are in one place and I don’t have to jump between apps to get a full picture of my day.

Rewrote my car-sales-ad grabber from Selenium to Playwright and exported the data of a specific search to a spreadsheet. I could easily derive trend lines for price over mileage and price over age, and since there is a nice unified JSON somewhere in each page with all the features per car, I could see if a more expensive one was worth it due to more features, or not. Served me well when buying but also when selling a car.

I wrote a wrapper around brscan and my Brother printer (which has a feeder but no duplex scan or scan-to-network-folder capability) to get double-sided, deskewed, OCR’d PDFs into paperless-ngx at the touch of the physical scan-to-PC button.

I’m medical and a hobbyist, not technical, so I was happy to figure this out in bash with some trial and error. I scan the fronts, it gives me 15 seconds to put the backs in the way they came out, and it deinterlaces them afterwards with deskew, OCR, etc. I used it to shred most of my filing cabinet and now scan bills/letters/receipts/etc. A cron job backs up paperless-ngx offsite using borg. For me it’s just slick, my spouse uses it independently (!), and it helped my bookkeeping a lot.
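
The author’s version is bash; for illustration, the deinterlacing step could be sketched in Python with pypdf (filenames invented), with something like `ocrmypdf --deskew` handling the cleanup afterwards:

    from pypdf import PdfReader, PdfWriter  # pip install pypdf

    fronts = PdfReader("fronts.pdf")  # odd pages, scanned first
    backs = PdfReader("backs.pdf")    # even pages, fed back in as they came out

    writer = PdfWriter()
    # The backs come off the feeder in reverse order, so walk them backwards.
    for front, back in zip(fronts.pages, reversed(backs.pages)):
        writer.add_page(front)
        writer.add_page(back)

    with open("collated.pdf", "wb") as f:
        writer.write(f)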

My first non-trivial bash script was a CLI frontend for an MPV daemon that used regexes to manipulate the playlist. It also had all the basic controls you’d expect from a music player. Kind of like a CLI Quod Libet. That thing was awesome.

Sadly it was lost when the HDD crashed, and I never got around to remaking it, because reasons.

The time that my boss promised we could write an AppleScript for InDesign which would allow adding an index to an InDesign document: it turned out the index was 4 levels deep, and when I asked about doing this on the InDesign mailing list, Olav Martin Kvern, who was then Adobe’s Scripting Evangelist, declared that it was impossible, because only the first level had support.

After a bit of experimentation I determined that it was necessary to verify the existence of a given index entry at a given level: if it existed, one could add additional entries under it; if it didn’t exist, it was necessary to create the level above as an index entry first, and then it was possible to add the new one.

I probably have the code somewhere, but I haven’t done any AppleScripting in a while now (I kind of miss it; PowerShell is nowhere near as comfortable, and there’s no equivalent to Application Dictionaries).

I work at a database-heavy company as an infra engineer. From time to time we get requests from devs or customer success to run a query on all customer databases to see if certain assertions hold (e.g. whether the latest update fixed a problem, or to estimate the impact of a change).

To automate this, we have an Ansible playbook that handles most of it: give it a query and you’ll get a bunch of files with the results.

While useful, the setup and cleanup can get tedious. So I whipped up a little shell script that creates a temporary directory and symlinks it to a location somewhere under $XDG_STATE_HOME. I created a few subcommands to create the directory structure, add a query, run the query, aggregate results, and destroy the state. This keeps things simple for me. Even if I forget to clean up after myself, a reboot will make sure nothing is left behind. The symlink trick into XDG state home helps me a lot to keep track of my state while I work on a certain problem. It allows me to easily handle multiple requests at the same time when necessary.
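
The core of the trick might look roughly like this in Python (the “dbquery” naming is my invention):

    import os
    import tempfile
    from pathlib import Path

    state_home = Path(os.environ.get("XDG_STATE_HOME",
                                     Path.home() / ".local" / "state"))

    # The working directory lives under /tmp, so a reboot cleans it up...
    workdir = Path(tempfile.mkdtemp(prefix="dbquery-"))

    # ...while a stable symlink under XDG_STATE_HOME makes it easy to find.
    link = state_home / "dbquery" / "current"
    link.parent.mkdir(parents=True, exist_ok=True)
    if link.is_symlink():
        link.unlink()
    link.symlink_to(workdir)
    print(f"working in {workdir} (tracked at {link})")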

I have a script listed in package.json for my site. Often I don’t really care about commit messages there, not because I don’t find them valuable, but because so often it’s just from editing and publishing articles. I don’t really want to think of messages all the time, so I often end up typing random crap.

I use this now:

    import { execSync } from 'child_process';

    // Stage everything first so new files show up in the status output.
    execSync('git add -A');

    // Parse `git status --porcelain` into a list of file paths.
    const modifiedFiles = execSync('git status --porcelain')
        .toString()
        .trim()
        .split('\n')
        .map((line) => line.replace(/^\s*[MADRCU?!]{1,2}\s+/, ''));

    const fileList = modifiedFiles.join(', ');
    const commitMessage = `Change: ${fileList}`;

    execSync(`git commit -am '${commitMessage}'`);


    
    "changes:summary": "tsx ./tools/changes-summary.ts",

This is exactly why I made it, in fact a lot of commits in there are wfwefewf, frefrregre, stuff, wefewfwwefew

Every so often I’m tempted to figure out how to squash/collapse the entire history down to a single commit and then always use that script.

I can’t even remember the year this started but it was definitely what got me into coding.

A virus that turned files and folders on removable drives into hidden system files was everywhere, and antivirus programs just didn’t help with it. Friends and co-workers would constantly bombard me for help recovering their files. When I figured out essentially how the pesky virus worked (this was before the variants that made registry edits), I coded something to do the opposite in a batch script. I’d set it to detect when a portable drive was added, and it would delete the virus and the .lnk shortcut files it created, then unhide the folders and files and return them to normal.

Nothing fancy, but it influenced my career direction a lot.

Recently developed a small embedded device (using an nRF52840 microcontroller) that allows me to open my front gate remotely using my phone. I no longer have to carry around a gate remote, and my friends/guests can now open the gate from their phones as well. I couldn’t find anything on the market for doing that, so I’m glad I managed to do it (and it was a good opportunity to learn some embedded programming and electronics).

Nothing specific per se, but I recently retrofitted an IBM ThinkPad I bought off eBay and installed Linux Mint onto it running i3wm. I basically control the entire OS with my tiny keyboard; I have a ton of hotkeys that make moving everything around insanely fast. Using this alongside Vim allows for very efficient use of my machine. Highly recommend.

Aside from this, I basically just rely on things like ctrl+p for search, plus various bash/shell scripting and ChatGPT, to be efficient. Always improving!

Script to pull, patch and build libinput with a modified touchpad acceleration function to be more similar to Windows touchpad acceleration. Triggered by a pacman hook whenever libinput gets updated. I think they may have merged the ability to create custom acceleration profiles now though?

I wrote a quick integration of the GPT-3 API into Emacs [0]. Although it’s a very simple implementation, it’s made many small things I would’ve previously done manually much quicker (i.e. one-off calls to convert a hyperlink in X format to Markdown, abbreviate first names in author lists of papers, or turn some “text message speak” into a paragraph).

[0] https://github.com/samrawal/gpt-emacs-macro

I used to work with a cardiac electrophysiology group. The software used for acquisition and analysis was quite old, written for DOS back in the mid-1990s. At acquisition, files were named like YYMMDDXX.0nn, where XX stands for the setup where the acquisition took place and the nn at the end was the ordinal number of each file. These were encrypted binary files, so we were completely dependent on the original software.

At analysis, files containing multiple sweeps were averaged to reduce random noise (YYMMDDXX.Ann files), then these were exported as a CSV-like series of time-voltage pairs (YYMMDDXX.Tnn files). Tables containing the calculated variables were also generated and exported (YYMMDD.Nnn files). The fun part is that these files had to be named MANUALLY. Each and every one of them. I can’t stress enough how repetitive this got… We generated a double-digit number of files every day, and analyzing each file thoroughly required around 30-50 keypresses to move around the menus and name the files. Lucky for me, no mouse use was required, and keypresses could at least be automated.

I used DOSBox on Debian to do the analysis, and I ended up creating a bash script that could automatically analyze whole folders of these files in a few minutes. To achieve this, I generated xmacro files that would be played back while the DOSBox window was open; opening the file was also put in these xmacro files. The generation of the files was wrapped inside a bash script that kept track both of the files in the folder and of the files generated by the analysis. If a file was supposed to be there but something broke inside DOSBox, it would just stop playing the macro for the next file, so it could be restarted relatively easily.

A few months later, I met the guy who wrote the software for our team and asked him if he could write us a script to unpack the binaries into CSVs. From there, I could come up with my own completely automated solution for analysis, and everything was much, much faster. I also showed him the macro-monster I created. I’m still not sure if he was amazed or just thought I was impatient.

I wrote some scripts to automate the tedious parts of my photo-to-texture pipeline. I learned enough of Blender’s API to make a small plug-in to handle some of the rendering tasks. None of these are particularly sophisticated or impressive but they’ve saved me a lot of time.

Back around 2013 I needed Trello for a project… I couldn’t seem to get the UI right and decided to use Gmail as the front end. It worked perfectly and was easy for everyone to understand.

Short little scripts and applications I am quite proud of (most available on github, most have videos on youtube):
A Python script that would go through a bunch of PDFs submitted to a conference and slap on the copyright, current year, and conference name, so my wife didn’t have to hand-edit the fifty or sixty papers submitted, which she regularly had to do about 10 times per year before the pandemic.

A couple of PowerShell scripts that move multi-gigabyte files from the computer that generates them to an intermediate server; another computer then picks those up and puts them onto whatever storage space is available, squeezing them in via a naive knapsack algorithm (a sketch of that step follows this list). It shaved about two hours per day off my workload, instead of manually copying files around the network and then monitoring them to make sure they got where they were meant to go.

A couple of Python scripts that tell a number of SONY video cameras to wake up, pull in the video and audio over USB, verify the files transferred correctly, then erase them from the camera to make space on the SD card.

A simple voice-controlled personal assistant that can start & stop multiple video cameras, perform the “mechanical clapper” effect for each scene and take, label the video files, and read out the prompt for the scene about to be shot.

A simple C# stopwatch timer for Windows that logs my daily walks via a touch-screen computer located near the front door, calculating speed & distance for that walk, total daily time, etc. It hooks into a couple of security cameras to track me as I walk past the house on my laps, recognize that it’s me, and timestamp my passing (by the camera).

A simple Python script that pulled a whole bunch of data out of PDFs for a research project. It saved the small research team multiple thousands of dollars that they couldn’t get a research grant for, and was able to then save the department multiple tens of thousands of dollars each year from that point onward. I got paid a large pizza (toppings of my choice). And made a couple of friends of some librarians.

A home point touch off position script for my CNC.

There are dozens more I’m proud of that I am forgetting. These are the kinds of interesting little projects that take an afternoon or an evening and often have a huge impact on quality-of-life.
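
For the file-moving scripts above, the “naive knapsack” placement can be as simple as first-fit decreasing. A sketch in Python (the data shapes are invented):

    def place_files(files, volumes):
        """files: {name: size_bytes}; volumes: {name: free_bytes}.
        First-fit decreasing: biggest files first, onto the emptiest volume."""
        placement = {vol: [] for vol in volumes}
        free = dict(volumes)
        for name, size in sorted(files.items(), key=lambda kv: -kv[1]):
            for vol in sorted(free, key=free.get, reverse=True):
                if free[vol] >= size:
                    placement[vol].append(name)
                    free[vol] -= size
                    break
            else:
                raise RuntimeError(f"no volume has room for {name}")
        return placement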

This bash config allows you to cd directly to the most commonly used directory with a specific name. It also has tab autocomplete. It does require logging all used directories though.

  tabChar=$'\t'
  function prompt_command {
      echo "$(date +%Y-%m-%d--%H-%M-%S)$tabChar$(hostname)$tabChar$PWD$tabChar$(history 1)" >> ~/.full_history
  }
  export PROMPT_COMMAND=prompt_command

  # cd to the most frequently used directory whose basename matches $1
  function c {
      cd "$(tail -n 10000 ~/.full_history | cut -f 3 | awk -F/ -v n="$1" '$NF == n' \
          | sort | uniq -c | sort -hr | head -n 1 | sed -E 's/^ *[0-9]+ //')"
  }

  # Tab completion: offer directory basenames, most common first.
  function _c {
      local cur=${COMP_WORDS[COMP_CWORD]}
      COMPREPLY=($(tail -n 10000 ~/.full_history | cut -f 3 | awk -F/ '{print $NF}' \
          | sort | uniq -c | sort -hr | sed -E 's/^ *[0-9]+ //' | grep "^$cur"))
  }
  complete -F _c c

I made a fzf-powered tool that makes cd’ing into deep directory trees quite fast and easy. It’s somewhat reminiscent of ranger/nnn/broot, but in my mind, it eliminates the mental context switch of entering/exiting those tools.

Some gifs (with speed comparisons) here: https://github.com/dp12/probe

I wrote SimplifyRecipe.com as a means of getting rid of the extra fluff of recipe websites, and it runs via a shortcut in iOS. Overall it works remarkably well!

A bunch of Ansible that runs against Proxmox. Makes setting up a quick VM or LXC a 30-second task.

About the same result as the GUI, but with IaC advantages.

Just spent a couple of hours and pushed out a script that does very limited YAML-to-dbt “transpiling”. Basically we are moving from a YAML-based Airflow solution to another one with dbt support. There are about a hundred jobs to migrate, so I figured some automation was needed. It doesn’t cover every situation, but I estimate it can save at least 50% of the effort.

Years ago, when I was receiving a very high volume of predictable and pretty malicious-looking spam, I wrote an ugly Python script to report the unwanted mail to the originating network owner. It worked well at the time, and I think it definitely made the lives of spammers more difficult. The volume of spam I receive has been much lower ever since.

I created a small script that runs every time I log into my work notebook. It starts VPN, Teams, Firefox, Windows Terminal and my other development tools. Huge time saver as I’m usually getting something to drink or brushing my teeth in the morning when I log in.

My favorite is a little command-line helper called “o”. It’s a helper for copy and rename commands. If I want to fix a misspelled filename I can just type

o mv mispelled.txt

and it will make an interactive prompt that looks like:

mv mispelled.txt mispelled.txt_

but the last word is editable, so I can use the arrow keys to alter it in place

https://github.com/jes5199/o

BTW, there is a bash feature, brace expansion, that you can use for this:

touch a_tset_file.txt
mv a_t{se,es}t_file.txt

it works for changing file extensions:

touch foo.txt
mv foo.{txt,json}

or for adding or removing characters; just leave the other side of the comma empty:

touch foo.txt
mv foo{,bar}.txt
mv foo{bar,}.txt

This little script in my i3 keybindings opens a new terminal with the same working directory as the currently focused terminal, pretty useful for my workflow:

  bindsym ctrl+$alt+t exec xdotool getactivewindow getwindowpid | xargs ps | grep "alacritty" && (xdotool getactivewindow getwindowpid | xargs pgrep -P | xargs pwdx | cut -d":" -f 2 | xargs alacritty --working-directory ) || WINIT_X11_SCALE_FACTOR=1 alacritty

https://github.com/pkos98/Dotfiles/blob/master/config/i3/con…

I probably scraped some parts together from the web a few years ago.

I inadvertently caused a VERY VERY large corporation that is a household name to change its TOS through one of my little automation scripts. By little, I mean a few dozen lines of code and some ingenuity.

The corp in question: Begins with V and ends with z and likes to screw customers.

All above board, approved by several levels of various teams/depts. The head of a specific dept sent me several strongly worded texts/voicemails. He was rather expressive to say the least over something that cost $0.

I could go into more detail, but there are a few NDAs I signed.

If you really wanted to write about it, you could have just avoided referring to the corp name, the TOS, and the NDAs at all, and just written about the technical details of the script. You could even have done so under a throwaway.

But the technical details would make him identifiable?

Although I agree with you, as it is, OP is just bragging about the effect while we’re looking to be impressed by the technical details of the hack.

I made a small script to get weekly recipes with a certain macronutrient composition from ChatGPT, then I linked the ingredient list to my supermarket’s API, so every Monday everything gets delivered.

I keep thinking about publishing it, but I have a script which takes a Raspberry Pi SD card image, extracts it, modifies it (mostly by automatically editing cmdline.txt and fstab) to be suitable for network booting, and then pushes it to my netboot server.

This way I can quickly boot any Pi in my house (and there are quite a few now) to any arbitrary OS image just by renaming a symlink on the server.
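
The cmdline.txt/fstab edits might look something like this once the image’s partitions are loop-mounted (all paths and the server address are made up):

    from pathlib import Path

    BOOT = Path("/mnt/img-boot")    # boot partition of the loop-mounted image
    ROOT = Path("/mnt/img-root")    # root partition
    NFS_ROOT = "192.168.1.10:/netboot/current"   # made-up netboot server path

    # Point the kernel at an NFS root instead of the SD card partition.
    args = (BOOT / "cmdline.txt").read_text().split()
    args = [a for a in args if not a.startswith("root=")]
    args += ["root=/dev/nfs", f"nfsroot={NFS_ROOT},vers=3",
             "ip=dhcp", "rootwait", "rw"]
    (BOOT / "cmdline.txt").write_text(" ".join(args) + "\n")

    # Drop the SD-card mounts from fstab; the kernel mounts the NFS root itself.
    kept = [line for line in (ROOT / "etc/fstab").read_text().splitlines()
            if "PARTUUID=" not in line]
    (ROOT / "etc/fstab").write_text("\n".join(kept) + "\n")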

I wrote a small Python script that automatically downloads a song from the active Firefox tab (usually a YouTube tab), fills in the ID3 fields of the resulting file using AcoustID, and saves it in the correct folder according to the genre of the song.
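
Grabbing the active tab’s URL is the fiddly part the author doesn’t describe; given a URL, the download-and-tag half could be sketched like this (the URL and AcoustID key are placeholders; pyacoustid also needs the chromaprint fpcalc tool installed):

    import acoustid                       # pip install pyacoustid
    from mutagen.easyid3 import EasyID3   # pip install mutagen
    from mutagen.id3 import ID3NoHeaderError
    from yt_dlp import YoutubeDL          # pip install yt-dlp

    URL = "https://www.youtube.com/watch?v=PLACEHOLDER"   # e.g. from the active tab
    OUT = "song.mp3"

    # Download the audio and convert it to MP3 via ffmpeg.
    with YoutubeDL({"format": "bestaudio", "outtmpl": "song.%(ext)s",
                    "postprocessors": [{"key": "FFmpegExtractAudio",
                                        "preferredcodec": "mp3"}]}) as ydl:
        ydl.download([URL])

    # Look the file up by audio fingerprint and fill in the ID3 fields.
    for score, recording_id, title, artist in acoustid.match("MY_ACOUSTID_KEY", OUT):
        try:
            tags = EasyID3(OUT)
        except ID3NoHeaderError:
            tags = EasyID3()
        tags["title"], tags["artist"] = title, artist
        tags.save(OUT)
        break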

I usually check the weather on the forecast.weather.gov website, with these bookmarked URLs:

https://forecast.weather.gov/MapClick.php?lat=40.7983&lon=-1…

The problem I had with these was that the pages are always in desktop mode, even on mobile. So I made a one-page static website which takes & stores these URLs in local storage, & fetches the one I click on. On the next pageload, it remembers the last loaded city & loads it again.

The whole page is in mobile-first view mode in the CSS.

https://spa.bydav.in/weather/index.html

You might see nothing at first, as you need to preload/populate the local storage with either manually copied URLs or by just pasting this JSON:

  [
  {
    "abb": "SNR",
    "full": "Sonora",
    "url": "https://forecast.weather.gov/MapClick.php?lat=37.98&lon=-120.38&unit=0&lg=english&FcstType=json&TextType=1"
  },
  {
    "abb": "RDB",
    "full": "Red Bluff",
    "url": "https://forecast.weather.gov/MapClick.php?lat=40.1781&lon=-122.2354&unit=0&lg=english&FcstType=json&TextType=1"
  },
  {
    "abb": "SFO",
    "full": "San Francisco",
    "url": "https://forecast.weather.gov/MapClick.php?lat=37.7771&lon=-122.4197&unit=0&lg=english&FcstType=json&TextType=1"
  },
  {
    "abb": "ERK",
    "full": "Eureka",
    "url": "https://forecast.weather.gov/MapClick.php?lat=40.8033&lon=-124.1595&unit=0&lg=english&FcstType=json&TextType=1"
  }
]

I’m nursing a small fleet of CentOS 7 servers while we finish up migrations and upgrades this year. Most of these servers need certificates, but the built-in certbot package is really brittle (if not actually broken) depending on what plugins you might need to use. So I wrote a short wrapper script that handles everything for me:

https://gist.github.com/xenophonf/893b323b99644290fad420a54c…

It keeps the custom certbot install up to date, and if something goes wrong (and only if something goes wrong) I’ll get an email, thanks to chronic. The same goes for the install command outputs when running the wrapper/certbot interactively: you only see error messages when something breaks. Otherwise, it looks exactly like certbot was invoked directly, just with a short delay while the script does the install/update in the background.

Getting transaction data from banks is hard. Especially timely data. Luckily both my banks have the option for a notification email being sent immediately after each Visa or checking account transaction.

Using n8n I can parse those emails and put the data into a nice Airtable base (and a CSV for backup). Next step is to add GPS coordinates to each transaction.

Does anyone know if there is a way to run a secure and private location tracking service to my iPhone, or is my only hope to pay Apple the annual fee and do it myself?

(Yes: I wrote “secure and private location tracking” with a straight face – I’m hoping there’s a solution out there)


