Posts tagged "tech"
March 2, 2022
In case you have not heard, Spotify sucks. A lot.
Here's a summary of my post-Spotify exploration of music. I'm still trying to decide how I want to consume music in the future, but what I'm currently thinking about is:
- Apple Music for 'all you can eat' music streaming and music discovery
- Accompanied by a Plex music library and PlexAmp on my phone for music that I want to collect and keep
I'm starting to buy interesting music on Bandcamp, especially newer stuff, new genres and collaborations, things that may never be released on CD or any other physical format.
I'm using the SF Public Library's incredible music collection (vinyl and CD!) to borrow and listen to older music that I may have missed out on, or music that I want to listen to in higher fidelity, on different equipment.
Moving Spotify playlists to Apple Music or Tidal
Using services like TuneMyMusic, I was able to easily move my playlists to Apple Music and Tidal. I evaluated both services before deciding on Apple Music. Tidal did not do a good job recognizing or having access to some of my non-English music. Apple Music did not miss a beat. I'm very firmly entrenched in the Apple walled garden, so it was also a good reason to get Apple One (so everyone in my family can also have Apple Music). If you don't use iOS, you may want to evaluate other alternatives.
I decided to pony up the $5 fee on TuneMyMusic to move my music to Apple Music. It was a one-off action that took a few hours to complete. After that, I canceled my subscription as I have no need to keep my playlists in sync.
There are probably ways to do this cheaply or for free with command-line tools, but I did not have time to look into them.
Buying music on Bandcamp
I'm lucky to be friends with many music nerds and music lovers who have carefully curated playlists and music collections. Some of them also share their favorite new music on Bandcamp.
I started buying a few albums there. I'm still finding my way around Bandcamp (it looks like I'm buying stuff that I really like, and also experimental stuff I maybe don't like as much, but find interesting enough to keep).
I don't like listening to the albums on the Bandcamp website or app, so it's handy that they let you download lossless files of the music you buy.
Setting up Plex and PlexAmp
Since I already had a Plex media server setup, it was simply a matter of setting up a new folder and library for music files.
I learned that the metadata on Bandcamp files isn't the best. Plex works best when you can organize files in a hierarchical Artist / Album folder structure, and for music that doesn't come that way (like a lot of digital-only releases on Bandcamp), Plex just doesn't pick up the metadata cleanly.
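For reference, the kind of layout Plex scans most reliably looks something like this (artist, album and track names are placeholders):

```
Music/
  Some Artist/
    Some Album/
      01 - First Track.flac
      02 - Second Track.flac
  Another Artist/
    Another Album/
      ...
```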
I decided to give Beets a go. The project is mature, many people swear by it, and its documentation is excellent.
First, I had to set up the config.yaml file. Here's my config file, in case it helps.
Then, I had to install the right plugins. For my use case, the beetcamp, acousticbrainz and discogs plugins were the most useful.
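For illustration, a minimal config.yaml along those lines might look like this (paths are placeholders, and the plugin names are the ones listed above; some third-party plugins register under slightly different names, so check each plugin's docs):

```yaml
directory: ~/music            # where beets puts imported files
library: ~/beets-library.db   # beets' own metadata database

plugins: beetcamp acousticbrainz discogs

import:
  move: yes    # move files into the Artist/Album layout Plex expects
  write: yes   # write the corrected tags back into the files
```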
I successfully re-imported all of my music files into my Plex music library like this:
beet import ~/some-path-to-music
After installing the right plugins, I was able to find matches for all of the music, including some very obscure old stuff. You can even set up the PlexUpdate plugin to let Plex know to update the music library every time music gets imported with beets.
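The PlexUpdate hookup is just a small config addition; hypothetically something like this (host, port and token are placeholders for your own Plex server's values):

```yaml
plugins: beetcamp acousticbrainz discogs plexupdate

plex:
  host: localhost   # where your Plex server runs
  port: 32400       # Plex's default port
  token: YOUR_PLEX_TOKEN
```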
I'm very happy with this setup, and will probably continue to grow my music collection in this way.
May 29, 2021
I have been on a roll of late with my data liberation project.
The last piece in my photo liberation project was to figure out a way to take out all of the data from iCloud. Having been in the Apple walled garden for more than a decade and a half now, I have.. a lot of stuff in there.
Apple's official documentation simply says "log in to icloud.com, select the photos you want and download as a zip". What if, like me, you've got tens or hundreds of thousands of photos?
Enter iCloud Photos Downloader, a Python utility that sucks out all of your iCloud photos into wherever you're running it.
In my case, I've already got a Linux server going for my photos, so that's where I wanted it. The eventual goal is to put all of the photos into PhotoPrism there, as I like its tagging and deduping functionality, and to have everything live at photos.mydomain.com. Right now, I've only got my Google Photos in there. Time to get my iCloud photos in as well.
Install iCloud Photos Downloader on your server or another computer
In my case, I just did a git clone of [this repo] on my Linux server. Once downloaded, I cd-ed into it and ran the following commands:
$ pip install icloudpd
$ pip install -r requirements.txt
As with any other pip package, you can run into errors because of your Python environment. I ran into a problem with having too many Pythons: the ./icloudpd.py script would not run, throwing a Python module error.
To fix this, I opened icloudpd.py in a text editor and edited the first line from #!/usr/bin/env python to #!/usr/bin/env python3. This tool needs Python 3.6+ to run.
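If you'd rather not open an editor, the same fix can be done with a one-liner (assuming you're in the cloned repo's directory):

```shell
# Rewrite the shebang on line 1 to point at python3 explicitly.
sed -i '1s|/usr/bin/env python$|/usr/bin/env python3|' icloudpd.py
```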
Starting the download process
On my Linux server, I created a directory for my photos called icloudphotos.
I then ran the command:
icloudpd --directory ~/icloudphotos \
--username myemail@domain.com \
--password password
The tool will prompt you to log in and authenticate to iCloud.
Note: if you have 2FA enabled, you will most likely have to re-authenticate every 90 days or so.
I got tens of thousands of photos as expected. The tool shows you a nice little progress bar with basic information. It ran for several hours (around 5 or 6?), but it really depends on your connection speed. You can skip video downloads with the --skip-videos option. You can also have it email you when it's done using the various SMTP options, but I did not want to bother with that.
Running icloudpd as a cron script
The next step in my workflow will be to run this as a cron script. It looks straightforward enough.
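As a sketch, the crontab entry would look something like this (the binary path, schedule and log location are assumptions for my setup; note that a cron entry has to stay on a single line, and real credentials shouldn't live in a readable crontab):

```shell
# Nightly at 03:00: pull any new iCloud photos, using the same flags as the interactive run.
0 3 * * * /usr/local/bin/icloudpd --directory /home/me/icloudphotos --username myemail@domain.com --password password >> /home/me/icloudpd.log 2>&1
```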
I also have Syncthing set up and I am evaluating which workflow I prefer. I might want to continue keeping a copy of all photos on both iCloud and on PhotoPrism for redundancy.
In any case, I'm glad to have found a non-GUI way to access my iCloud photos. This will make any projects in this category much easier from now on.
May 19, 2021
If you, like me and many others, have started to feel uncomfortable about one company knowing everything about you, moving off the Google ecosystem is the natural first step. There are lots of alternatives for the main features: for search, there is DuckDuckGo, which is improving all the time and has now fully replaced Google search for me. There are Fastmail, Proton Mail and many other alternatives for email. For photos, though, Google Photos and iCloud Photos still reign supreme.
I have attempted over the last couple of years to move off Google Photos. Each time, I've been let down by problems in bandwidth and download speeds. If you have vast amounts of data, it can get very difficult to work with the raw data you obtain from Google Photos using a graphical user interface. Each time I've tried to do that I've ended up with corrupted files or incomplete data.
For this reason, I eventually designed this plan.
PhotoPrism in Docker Compose
With a reverse proxy serving it at a photos.mydomain.com address over HTTPS.
To be honest, while I know my way around servers I don't have a lot of experience with containers, networking or security. I did not want to attempt this project until I succeeded in getting a beginner's version of all that up online.
Choice of self-hosted photo software. I looked mostly at PhotoPrism and PhotoStructure. Both of these projects appeared closest to the sort of self-hosted Google Photos-esque application I was looking for. Many other photo projects are far closer to web 1.0 style web galleries. In my case, I had more than a quarter of a million photos and videos strewn across multiple clouds. I have ADHD, and it has been very difficult for me to organize things.. anything.
Hardware. I decided that I wanted to lease a server in Europe, because there are very good deals to be had there. Hetzner, OVH and an assortment of related companies like SoYouStart and Kimsufi: I've used most of them at various times in the past. It's relatively affordable to get up and running on a server built from used or old parts, and for the most part it works out cheaper than trying to own your own hardware right now (in the midst of a global chip and memory shortage). Many people certainly do this sort of work on a NAS or a Raspberry Pi, but I knew I wanted something with many more cores. I got a Xeon E3 server to start, but may upgrade later. $27 a month is not a bad deal at all for a dedicated server with those specs (16GB RAM, relatively decent uplink).
Source of data, and download method. As mentioned previously, I have not had much luck with retrieving my data from Google in the past. This time, I decided to completely avoid downloading my data to local storage, knowing that even with decent desktops and laptops I would still struggle with handling all of this data. I decided to download the backup files directly into my server instead. I decided to do a Google Takeout of all of my Google Photos from my G-Suite domains (several!) and my personal account's Google Photos. You can do the same by going to the Takeout page. I decided to send Takeout data directly into OneDrive, where I have a temporary premium account solely for this purpose. I've noticed I can fetch data from OneDrive at very high speeds using rclone, at least 2-3x faster than from Dropbox or Google Drive.
Rclone, a fantastic tool I can't live without. I have been a huge fan of Rclone for a while now. While it works amazingly well for Google Drive and Dropbox, there are known limitations with rclone for extracting and moving Google Photos that I did not want to deal with. Mainly, using rclone for this purpose strips EXIF data, a known limitation of Google Photos' API.
When my Google Takeout is complete, I use rclone to download it from OneDrive to my server:
rclone copy onedrive: servername:/home/username/destination -P
For a 200GB backup of my Google Photos, Takeout gave me 4 files of 50GB each. rclone transferred them at 80-100MB/s, well under an hour in total.
I then unpacked all of the files into a single folder:
cat *.tgz | tar zxvf - -i
That gave me all of my photos in a single folder named Takeout.
Installing PhotoPrism using Docker Compose. The official Docker Compose instructions are pretty easy to follow. For reference, here's my docker-compose.yml file.
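If you don't want to dig up my file, a stripped-down compose file in the spirit of the official example looks roughly like this (the port, paths and password are placeholders; check PhotoPrism's docs for the full set of environment variables):

```yaml
services:
  photoprism:
    image: photoprism/photoprism:latest
    restart: unless-stopped
    ports:
      - "2342:2342"                            # web UI
    environment:
      PHOTOPRISM_ADMIN_PASSWORD: "change-me"   # set a real admin password
    volumes:
      - ./originals:/photoprism/originals      # your photo library
      - ./storage:/photoprism/storage          # index, cache, thumbnails
```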
Accessing your photos using a reverse proxy
For security, you don't want to access your self-hosted photos at SOME.IP.XX.XX:PORTNO or yourdomain.com:portno. You'll want to access it at a domain, preferably one you own. This was the hardest part for me: there are many ways to get a reverse proxy going, and I didn't know very much about all of that.
I decided to use LSIO's swag container. In a nutshell, LSIO provides very well-maintained Docker images for many popular homelab projects. You can easily stand up a wiki, a PVR, or even niche things like a self-hosted Markdown editor. I've used many of their images in other projects and I love how easy it is, how helpful the community is. The swag container was the one I spent the most time on.
It's helpful to read the docs and initial setup info. Once you figure out the ins and outs of how things are set up in this container, you can easily get https://yourdomain.com, https://anysubdomain.yourdomain.com or even https://yourdomain.com/subfolder up and running. Of all of the 'beginner' methods of learning to set up services with reverse proxies (and there are many: you can use Traefik, Caddy, docker-gen, etc.), this wound up being the one I felt I learned most quickly.
In summary, you want to:
- Set up DNS
- Get an SSL certificate for all your domains and subdomains
- Edit the proxy configuration files
Read the docs, or ask for help; it took me, someone with not a whole lot of infrastructure experience but who knows a bit of Linux, a couple of days to set it up correctly.
The swag container has many built-in templates that make this easy, once you learn its quirks.
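To give a feel for it, a subdomain proxy conf adapted from swag's bundled *.subdomain.conf.sample templates looks roughly like this (the container name and port assume a photoprism container on the same Docker network):

```nginx
server {
    listen 443 ssl;
    server_name photos.*;               # matches photos.yourdomain.com

    include /config/nginx/ssl.conf;

    location / {
        include /config/nginx/proxy.conf;
        set $upstream_app photoprism;   # Docker container name
        set $upstream_port 2342;
        set $upstream_proto http;
        proxy_pass $upstream_proto://$upstream_app:$upstream_port;
    }
}
```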

PhotoPrism has TensorFlow built in. Its pre-trained model doesn't get everything right (for example, it marked a plate of squid as 'baby'), but it is pretty good. My wife is placing a bet with me that I probably have more than 20,000 photos of Cookie. The moment of truth will probably be in a day or so, when all 200,000-ish photos I've got (over the last 20 years) are finally imported, indexed and tagged.
I was able to set up PhotoPrism in Docker Compose in this manner, and access it at https://photos.mydomain.com. While I'm currently importing and indexing a quarter of a million photos, I've been happy with the speed, performance and features, and have decided to sponsor the project. It's nice to see people working on useful software that works well and looks good.
I'm pretty happy with the progress I've made on this. I might make a tutorial for the more complex parts of this project later.
January 10, 2020
It feels like we all just woke up from a collective dream. The dream of the '10s, where we gave our content, perhaps even our personalities, away for free to Facebook.
No longer.
Not only have I cut that toxic company out of my life, I have also started thinking about how web 1.0 got it right: writing on the web, for yourself, with no ads, for free, with a tech stack you control... really was all that.
I don't have resolutions. I don't have aspirations towards goals I won't reach. I don't have diet-related or gym-related thoughts; exercise has slowly become part of my life again, and I'm thankful for that.
This year, I am taking the opposite route. Instead of doing new things, and becoming a new person, I am going to get really good at doing things I already know and love. Having dabbled in so many hobbies in the past, there are plenty of options to pick. I've quite enjoyed the heads-down learning over the last couple of months, and am looking forward to more.
The one thing that is new is the city I live in. From 2018 I have been living in a new city, San Francisco. I used to visit often, so it's not new-new, but it's new in that I live here with my wife, Sabrena, and get to experience it somewhat differently as a result. We're exactly where we need to be for now as a newly married queer couple, even though we hadn't planned on coming here. It's too bad we're both from countries that don't recognize our marriage.
I have been preoccupied with trying not to lead a conventional tech worker's life in San Francisco. It's so easy to fall into that trap of always-on, tech-enabled convenience. I find that if you do that, the city becomes much less diverse. I want to meet people, build relationships, be part of communities and be part of scenes outside of tech.
As you know, it's so much harder to make friends as an adult. So old hobbies have come in handy. I have been playing music again, casually, but perhaps later performatively. I have been exploring Tibetan Buddhism. I am pushing myself to do things, like bike camping and hiking, that would force me to meet new people and explore new places.
In many ways, this new year is just like many others. But I know now that health, family and happiness come first.
That should count for something.
April 11, 2017
Like so many people who grew up with the Internet, there have been many incarnations of my online self. To some, I will forever be the queer blogger who started writing about the lesbian experience as a teenager in Singapore in the early 2000s. Some find that courageous; I found it much more difficult to change pronouns than to pretend to be someone I was not. To others, I am a travel blogger who enjoys hiking across Asia on trains, bikes and boats. That is made possible by a blend of courage and stupidity, and it has served me well.
Read the rest of this post here.
October 18, 2016
If this looks bare to you, it's supposed to be.
I've just finished archiving all of my old posts and giving them some new life as something else they're not: cool.
By using Jekyll and GitHub Pages, this setup lets me edit the site in a way I much prefer now: with a text editor and git.
Most things are still here, and I'll add to it shortly.
But sometimes less is more.
If you're into that sort of thing, check out my repo!
October 1, 2016
I've started writing articles for Brink, a new media publication by the same people behind the Atlantic. My first piece is on Tech’s Role in Reaching Indonesia’s Rising Middle Class.
March 24, 2016
Fuck Medium. Seriously.
I have had enough of their terrible user interface, narrow writing experience, and the empty platitudes of ‘recs' and comments from people looking to improve their lives by reading inspiring content from people they don't care about. Worst of all? I hate people whining about Millennials more than Millennials themselves.
I started this site precisely so that I could tinker around under the hood, and that's what I've missed — tinkering. Writing. Slapping together bits of random code you find on the internet (now forking random folks' code on Github) and hoping it would work. I know a lot more about code and development processes now, but I still gain a huge amount of happiness from tinkering with things I don't know.
My archives are in a mess. I stopped writing here some time circa 2010. I don't know why. Life took over. I got lazy. I got fed up trying to do everything at once.
It might take some time to gather the things I posted on different parts of the web. But it should be worth it 🙂
July 18, 2013
- I got a Battlestar Galactica tattoo
- I'm pretty pleased about that
- It's one half of the pair of wings and Caprica constellation that Starbuck gets when she marries Anders
- Within a couple of hours of getting it, a random stranger proposed to me — saying she would get the other side
- Which would be romantic, but that would also mean (a) she's a Cylon (b) we'd have a tumultuous relationship (c) she'd better be damn good at Pyramids
- It's unbelievable that 10 years have passed since I first started to watch this show
- I don't generally fan-girl anything, but this was special
- I identify with Starbuck in far too many ways, if you know what I mean
- It's potentially far more meaningful than anything else I could have gotten
- After getting this done I do kinda feel like nothing can frak with my qi
I love BSG.