
Straightforward Linux Backups with rsnapshot

I hang around in technical support back-alleys. All too often a new person turns up asking for urgent help. Their system is catastrophically broken and they have no easy way to fix it. With a bit of help they can usually come to a fork in the road: do they wipe and re-install, or keep fighting with the computer to get it working? It's a knowledge, time, effort and convenience trade-off as old as technology itself.

One question we often ask the patient is "Do you have backups?". It's a simple, innocuous question. But if they don't, it closes off one avenue and focuses the attention on fixing the problem, rather than nuking & paving then restoring data. All too often the person with the problem says "no", they have no backups at all. I wish more people did back their systems up, but so many don't bother. I guess that's partly why "cloud" document editing and "cloud" photo storage are so popular these days. Who needs backups when you can just throw the computer away, log in from a new one and see all your data? It's a compelling convenience.

For those of us who actually store stuff on our PCs though, backups are a Good Idea. Here's what I do, which may be useful to others.

Hardware

My desktop PC has an external hard disk attached. I have an Inateck USB 3 dual-drive bay into which I lobbed a 4TB hard disk. It's attached with the standard USB cable supplied with the bay. It's powered on all the time, and sits quietly on a shelf near my computer.

I formatted the 4TB disk using the "Disks" utility on Ubuntu with the ext4 filesystem. Other filesystems and formatting tools are available, but this works, and I know the ext4 filesystem can be read by most things. So if this machine catastrophically fails, I can get to the data by plugging the drive into basically anything.

I set the partition's volume label to "backup", so when it's attached it shows up under /media/alan/backup. I like that it mounts under a human-readable name, and I can browse around it using the file manager.
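If you'd rather do the formatting from a terminal, here's a minimal sketch of the same thing, assuming the drive shows up as /dev/sdb1 (check with lsblk first; the device name here is just an example):

lsblk
sudo mkfs.ext4 -L backup /dev/sdb1

The -L option sets the same "backup" volume label the Disks utility would, so the drive still mounts under a human-readable path.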

Software

I'm running Ubuntu Groovy Gorilla (soon to be released as 20.10) on my main machine, but this guide should work for any common Linux distro. Install rsnapshot from your distribution's archive. Rsnapshot is a rather neat backup system. It keeps a configurable number of old backups going back hours/days/weeks/months.
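On Ubuntu and friends that's a one-liner (other distros will have their own package manager incantation):

sudo apt install rsnapshot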

The thing to note is that rsnapshot doesn't make entire backups each time, only what's changed. It uses rsync under the covers, and uses hard links to avoid lots of file duplication. So you end up with lots of backups where each successive one is only a (usually small) delta on top of the previous one. It's super clever, and means you can easily browse the directories in a file manager to find the file you need, from any of the historical backups.
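You can see the hard linking at work with du, which only counts hard-linked files once within a single invocation. For example, pointing it at two of the snapshots from my setup described below:

sudo du -hsc /media/alan/backup/robot-rsnapshot/alpha.0 /media/alan/backup/robot-rsnapshot/alpha.1

The second snapshot adds only a sliver to the total, because nearly all of its files are hard links to the first.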

Configuration

Once installed it needs a little configuration, but not much. The main thing to edit is /etc/rsnapshot.conf. Note though that it requires tabs, not spaces, between parameters. No, multiple spaces won't do; they must be tabs. It's somewhat annoying, but you get used to it after editing the file and finding your backups don't run anymore :)
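A quick way to check an editor hasn't quietly swapped your tabs for spaces is cat with -t, which prints each tab as ^I:

cat -t /etc/rsnapshot.conf | grep retain

Any retain line without a ^I between the fields is a line rsnapshot will choke on.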

I typically set the following:

This is the main important one: where the backups are gonna go. My hostname is robot, so I put them in this handy folder.

snapshot_root /media/alan/backup/robot-rsnapshot

no_create_root ensures that if your disk becomes unmounted for some reason, rsnapshot doesn't fill your internal disk up with a backup under the expected mount point. I've been bitten by that a few times.

no_create_root 1
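If you want to check the disk really is mounted (say, before kicking off a manual backup), mountpoint does exactly that:

mountpoint /media/alan/backup

With no_create_root set, rsnapshot refuses to run when the snapshot root is missing, rather than silently filling your internal disk.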

Retain is where you define how many backups of each level you want to keep. These used to be called hourly, daily, weekly and monthly in older releases of rsnapshot, but were changed to alpha, beta, gamma and delta because the time interval between backups is arbitrary, so the old names were a bit meaningless.

So setting this means I end up with the 6 most recent backups, then one of those gets hived off to become the beta backup, of which 7 are kept; one of those in turn gets hived off as gamma, and so on. This means you have multiple recent alpha backups, multiple older beta backups representing days, then gamma backups representing weeks and delta backups going back months.

retain alpha 6
retain beta 7
retain gamma 4
retain delta 3

This makes way more sense if I just show you the result of this on disk. Here you can see the alpha backups were done over the last 24 hours. The alpha.5 backup was at some point made into beta.0 and all the beta backups were shuffled up one. Then at some point beta.6 became gamma.0 and the other gammas were rotated up with the oldest one becoming delta.0 (when that finally happens). If we come back in a year, I'll have a few delta backups going back a while.

alan@robot:~$ sudo ls -ltr /media/alan/backup/robot-rsnapshot/
total 68
drwxr-xr-x 3 root root 4096 Aug 23 04:00 gamma.3
drwxr-xr-x 3 root root 4096 Aug 30 04:02 gamma.2
drwxr-xr-x 3 root root 4096 Sep 6 04:00 gamma.1
drwxr-xr-x 3 root root 4096 Sep 13 04:01 gamma.0
drwxr-xr-x 3 root root 4096 Sep 16 04:07 beta.6
drwxr-xr-x 3 root root 4096 Sep 17 04:00 beta.5
drwxr-xr-x 3 root root 4096 Sep 18 04:00 beta.4
drwxr-xr-x 3 root root 4096 Sep 19 04:00 beta.3
drwxr-xr-x 3 root root 4096 Sep 20 00:01 beta.2
drwxr-xr-x 3 root root 4096 Sep 21 04:06 beta.1
drwxr-xr-x 3 root root 4096 Sep 22 08:00 beta.0
drwxr-xr-x 3 root root 4096 Sep 22 15:18 alpha.5
drwxr-xr-x 3 root root 4096 Sep 22 16:05 alpha.4
drwxr-xr-x 3 root root 4096 Sep 22 20:03 alpha.3
drwxr-xr-x 3 root root 4096 Sep 23 00:04 alpha.2
drwxr-xr-x 3 root root 4096 Sep 23 04:00 alpha.1
drwxr-xr-x 3 root root 4096 Sep 23 08:01 alpha.0

Finally, the bit which tells rsnapshot what to back up. This is just a list of folders and where they get put inside each of the snapshot folders above. These are the defaults, to which you can add folders which are important to you. I really only deeply care about my data in home and configuration in etc.

backup /home/ localhost/
backup /etc/ localhost/
backup /usr/local/ localhost/

Here's what that looks like on disk:

alan@robot:~$ sudo ls -ltr /media/alan/backup/robot-rsnapshot/alpha.0/localhost
total 20
drwxr-xr-x 3 root root 4096 Jul 31 17:28 usr
drwxr-xr-x 5 root root 4096 Aug 15 19:32 home
drwxr-xr-x 145 root root 12288 Sep 22 20:15 etc

There are plenty of other options, but really the only ones I change are those above, leaving the rest at their defaults.
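One of those other options worth a mention is exclude, which gets passed through to rsync. The pattern here is just an illustration (and tab-separated, as ever); pick things you genuinely don't need copies of:

exclude .cache/

That keeps browser caches and similar churn out of every snapshot.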

Rsnapshot has a configtest option which validates the configuration:

alan@robot:~$ sudo rsnapshot configtest
Syntax OK

Testing configuration

It's a good idea to run a test backup to make sure your configuration is all correct. The easiest way to do that is just like this:

sudo rsnapshot -v alpha

This will verbosely output to the terminal what it's doing while it runs. The first backup may take a long while, depending on how much data you have, how fast your disks are, and so on.

Schedule

While it's possible to run rsnapshot manually, most of the time you want a scheduled backup. I just use a cron job in the root account. To set that up, either run this:

sudo crontab -e

OR

Create /etc/cron.d/rsnapshot (which may already exist, depending on version and packaging of rsnapshot).

Here's what I paste in (or what the packaging may have already put there). Note the "root" user field: that's only valid in /etc/cron.d/rsnapshot. If you use sudo crontab -e instead, drop that field from each line.

0 */4 * * * root /usr/bin/rsnapshot alpha
30 3 * * * root /usr/bin/rsnapshot beta
0 3 * * 1 root /usr/bin/rsnapshot gamma
30 2 1 * * root /usr/bin/rsnapshot delta

Every four hours, six times a day, the alpha backup is run. This is what actually does a backup. The rest technically don't.

Every day, at 3:30AM, a beta backup is run - which moves all the beta backups up one, and moves the oldest alpha to beta.0

Every Monday at 3:00AM a gamma backup is run - which moves all the gamma backups up one, and moves the oldest beta to gamma.0

Every 1st of the month at 2:30AM a delta backup is run - which moves all the delta backups up one, and takes the oldest gamma to become delta.0

Here are some extracts from my logs showing this happening (they end up wherever the logfile option in rsnapshot.conf points):

alpha:

[2020-09-23T08:00:01] /usr/bin/rsnapshot alpha: started
[2020-09-23T08:00:01] echo 3366612 > /var/run/rsnapshot.pid
[2020-09-23T08:00:01] /bin/rm -rf /media/alan/backup/robot-rsnapshot/alpha.5/
[2020-09-23T08:01:15] mv /media/alan/backup/robot-rsnapshot/alpha.4/ /media/alan/backup/robot-rsnapshot/alpha.5/
[2020-09-23T08:01:15] mv /media/alan/backup/robot-rsnapshot/alpha.3/ /media/alan/backup/robot-rsnapshot/alpha.4/
[2020-09-23T08:01:15] mv /media/alan/backup/robot-rsnapshot/alpha.2/ /media/alan/backup/robot-rsnapshot/alpha.3/
[2020-09-23T08:01:15] mv /media/alan/backup/robot-rsnapshot/alpha.1/ /media/alan/backup/robot-rsnapshot/alpha.2/
[2020-09-23T08:01:15] /bin/cp -al /media/alan/backup/robot-rsnapshot/alpha.0 /media/alan/backup/robot-rsnapshot/alpha.1
[2020-09-23T08:01:42] /usr/bin/rsync -a --delete --numeric-ids --relative --delete-excluded /home/ /media/alan/backup/robot-rsnapshot/alpha.0/localhost/
[2020-09-23T08:01:53] /usr/bin/rsync -a --delete --numeric-ids --relative --delete-excluded /etc/ /media/alan/backup/robot-rsnapshot/alpha.0/localhost/
[2020-09-23T08:01:53] /usr/bin/rsync -a --delete --numeric-ids --relative --delete-excluded /usr/local/ /media/alan/backup/robot-rsnapshot/alpha.0/localhost/
[2020-09-23T08:01:53] touch /media/alan/backup/robot-rsnapshot/alpha.0/
[2020-09-23T08:01:53] rm -f /var/run/rsnapshot.pid
[2020-09-23T08:01:53] /usr/bin/rsnapshot alpha: completed successfully

beta:

[2020-09-21T03:30:01] /usr/bin/rsnapshot beta: started
[2020-09-21T03:30:01] echo 305636 > /var/run/rsnapshot.pid
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.5/ /media/alan/backup/robot-rsnapshot/beta.6/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.4/ /media/alan/backup/robot-rsnapshot/beta.5/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.3/ /media/alan/backup/robot-rsnapshot/beta.4/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.2/ /media/alan/backup/robot-rsnapshot/beta.3/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.1/ /media/alan/backup/robot-rsnapshot/beta.2/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/beta.0/ /media/alan/backup/robot-rsnapshot/beta.1/
[2020-09-21T03:30:02] mv /media/alan/backup/robot-rsnapshot/alpha.5/ /media/alan/backup/robot-rsnapshot/beta.0/
[2020-09-21T03:30:02] rm -f /var/run/rsnapshot.pid
[2020-09-21T03:30:02] /usr/bin/rsnapshot beta: completed successfully

gamma:

[2020-09-21T03:00:01] /usr/bin/rsnapshot gamma: started
[2020-09-21T03:00:01] echo 273151 > /var/run/rsnapshot.pid
[2020-09-21T03:00:02] mv /media/alan/backup/robot-rsnapshot/gamma.2/ /media/alan/backup/robot-rsnapshot/gamma.3/
[2020-09-21T03:00:02] mv /media/alan/backup/robot-rsnapshot/gamma.1/ /media/alan/backup/robot-rsnapshot/gamma.2/
[2020-09-21T03:00:02] mv /media/alan/backup/robot-rsnapshot/gamma.0/ /media/alan/backup/robot-rsnapshot/gamma.1/
[2020-09-21T03:00:02] mv /media/alan/backup/robot-rsnapshot/beta.6/ /media/alan/backup/robot-rsnapshot/gamma.0/
[2020-09-21T03:00:02] rm -f /var/run/rsnapshot.pid
[2020-09-21T03:00:02] /usr/bin/rsnapshot gamma: completed successfully

delta:

[2020-09-01T02:30:01] /usr/bin/rsnapshot delta: started
[2020-09-01T02:30:01] echo 1424983 > /var/run/rsnapshot.pid
[2020-09-01T02:30:01] /media/alan/backup/robot-rsnapshot/gamma.3 not present (yet), nothing to copy
[2020-09-01T02:30:01] rm -f /var/run/rsnapshot.pid
[2020-09-01T02:30:01] /usr/bin/rsnapshot delta: completed successfully

Conclusion

It probably all looks more complex than it really is. In essence you just need to have a drive to backup to, point rsnapshot to it, run rsnapshot regularly and check the log file periodically to make sure it worked. This has saved my bacon a few times. I can rummage around on the disk to find files I accidentally (or intentionally) deleted, in backups dating back weeks or even months. If I nuke my machine I can be confident I have copies of my important files right at hand.
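Restoring is deliberately boring, because snapshots are just directories: you copy the files back out. A hypothetical example, with a made-up filename, pulling a deleted document back from a week-old snapshot:

cp -a /media/alan/backup/robot-rsnapshot/beta.3/localhost/home/alan/Documents/important.odt ~/Documents/

The -a keeps permissions and timestamps intact.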

Hope that helps someone!

Synergism Devblog Week 1

In my previous blog post, Committing to a Project, I talked about getting started building something. Well, I rummaged through the Vault of Unfinished Ideas and rebooted one. Future posts I make on this subject will likely be shorter, and more bullet-pointy. This one is more of an introduction.

As so often happens, a while back I pinged my good friend Stuart Langridge to get a sanity check on a game idea. I sometimes ping him with ideas, and he does the same with me. Our Telegram chats frequently start with "Hey, here's a game idea I just had". We then discuss it over minutes, hours or days, bouncing ideas back and forth. Finally we squirrel the idea away in The Vault to gather dust.

This one dates back to May 2018. I've dusted it off. It's going to be a game called "Synergism". Thanks to Stuart for the name!

It's an online multiplayer game, which is ambitious for me. I'm using Construct 3 to develop it. Yes, I could probably have used some other toolkit, language or framework. But I didn't. There's a ton of reasons for choosing Construct 3. It's a "cloud based" development environment, so no matter what computer I'm sat at, I can just launch a browser tab to https://construct.net/ and continue working.

Construct 3 generates HTML5 & NW.js based builds for numerous platforms, including the main desktop operating systems: Windows, macOS and Linux. It can also create builds for mobile platforms and the web. I'd like to make this game as widely accessible as possible, with minimal platform-specific changes. Construct enables me to do that.

Construct is also super easy to use. I'm "Not a Developer" but have enough coding "skills" to knock a game together. However, I don't want to have to code everything. I like that Construct abstracts away physics, OpenGL, audio subsystems, tilemaps, sprite layering and all those other tricksy things.

But we're getting ahead of ourselves. I started work on this a week ago, and have managed so far to knock together a very simple title screen and lobby area. This won't be pretty initially as I'm just trying to make it all work. So if you see any prototypes or demos linked from here, expect them to function, maybe, but look like hot garbage :D

I appreciate I haven't described the game at all at this point. That'll come. It's all a bit up in the air right now. I'm kinda designing as I go. I often fall into a trap of planning everything out too much and not actually getting anything done - see previous blog posts on this subject. This time I've mostly just dived in, followed a couple of tutorials and hacked about some code.

Here's what I have done so far.

  • Start new Construct project
  • Configure saving to the cloud
  • Create layouts for Title Screen, Lobby and Game
  • Create a super basic title screen using a large word and "pretty" font
  • Build Lobby screen 
    • Accept a player name
    • Allow player to Host or Join a game
    • Offer a game code (to share if hosting) or allow picking a game code if joining an existing game
  • Generate Web, and Desktop Linux builds to test
  • Upload game to https://popey.com/synergism

Next up I need to register each player who joins and then allow the players to actually start the game. Then, you know, develop the actual game. Tune in next week to see how far I get! :D


Committing to a Project

In my last blog post titled Paralysis of Choice I talked about my problem with committing to a personal project. I had a bunch of lovely feedback, some via the Twitter thread where I mentioned the post. A lot of it was from people telling me they feel the same way, or that it reminded them of other similar posts. 

So I'm not alone! Good. Phew!

A few days later Liam Dawe, my Internet friend from GamingOnLinux, tweeted "Tempted to make a game. Perhaps a wave based shooter, with some cool effects. Sounds like a good starting point?". This piqued my interest. Liam then said "I've wanted to for years, so now I'm just going to fucking do it. Even if it's shit, I did something."

Can relate!

So I piled in with the suggestion: "Thinking we should both do something (separately) to help motivate each other?".

This felt like a good plan! We could both work on our own little projects, in our own time, and provide updates and motivation to each other when we've made some achievements. I liked this idea. I'd be more motivated knowing someone else is working on something too.

Skip forward a week or so and Liam taunts me with "started your game yet? 😇"!

Yikes. All I'd done was think this was a good idea. I hadn't written any code, or even thought about what kind of game I wanted to write. But wait, this is supposed to be fun, a distraction, and something I can do in my spare time. I should go easier on myself, while still committing to make something, whatever that is.

Today I had a chat with my good friend Stuart, who always encourages me to create stuff. I mentioned I was 'falling behind' with the 'Liam Challenge'. Stuart and I have a meeting later this evening about something else, and he has challenged me to have something done by then. Thankfully the kids are out, which gives me a whole Saturday afternoon and evening to myself.

So I've rummaged through my Vault of Unfinished Ideas and found one I have been meaning to write for a few years now. Now it's a Small Matter Of Programming. I'll be back when I have something to show. Cheerio!

Paralysis of Choice

On the most recent episode of The New Show (a podcast I do with Daniel Fore and Joe Ressington), one of the listener questions was the following:

Have you ever struggled with the paralysis of choice, especially when deciding on a tool, service or product to use, tech or otherwise?

You can hear the discussion of this in episode 10 from around 25 mins in.

Initially my thoughts revolved around consumption. I have no real difficulty choosing products as a consumer. I walk into a store, find a thing, buy the thing and walk out. I rarely browse, unless it's a shop where I have no intended purchase, like on a recent trip to a game shop or comic store while on holiday.

After Joe talked about his experience of the paralysis of choice when he's creating content, it got me thinking. I have a problem with content consumption, in that I find there's too much choice (Netflix, Spotify, Podcasts) which leads me to choosing the easy option of listening to or watching the same stuff repeatedly.

However, thinking more, I also have the same content creation paralysis too. Mostly around game and application development. Anyone who's heard me online in the last year will know I don't consider myself a "Developer" despite my job title being "Developer Advocate". I can advocate for the tools other people develop, but I don't technically develop them myself, is how I rationalise it.

But in my spare time, I often consider ideas for games and applications. I have physical notebooks and electronic documents going back maybe 15 years, containing ideas, sketches and plans for games and applications - mostly games. Over the years I've made a hundred aborted attempts to turn these ideas into real, tangible things. The problem I think I have, is choice.

As I'm "not a developer" I've not had any formal training or education on software development, at least not for over 20 years. Back in the late 1980's at school and college I learned a bunch of languages like InfoBASIC, Z80 Assembler and even COBOL. I later taught myself Pascal and dabbled with 8088 Assembler on the PC. 

Through my work career I've fiddled with a bunch of languages including SAP ABAP/4, Python and Java, but never to a point where I could plan and develop an application from start to finish. More recently I've played with Godot, Unity, Construct 2 & 3, Phaser, Lua, Love2D and more I've probably forgotten.

There are (at least) two issues here. Firstly, because I don't know any of the tools particularly well, I frequently get stuck very early on. I might choose a toolset / framework, bootstrap a project and then get frustrated because I don't know the language or libraries well enough. I'll put it to one side, meaning to come back to it later, but often don't.

The other problem is that of choice. Because I have a high level, but surface-deep idea of a bunch of languages and tools, but not in-depth knowledge, I can't pick one. I try one, get frustrated, and go back to the start. The next time around I remember the frustration, and pick a different tool, thinking things will be better. They aren't. 

Part of the problem is not doing this as my day job - not a developer, remember - so not having a bunch of solid experience to draw from. Another part is that I often only have a couple of hours here and there to hack on these projects in between work, looking after the family, and the house. So I never get deep into any of the projects before either running out of time or getting frustrated.

In the past, one solution I used was to pay someone else to do the work for me. I designed the application and wrote a spec, and they did the actual coding. I can understand how it all works once someone else has written it, but actually starting from a blank page and writing the code seems like something my brain isn't wired for.

Am I alone? Other people get this? What's the way out? Help!

Counting to 100 Million

About 10-15 years ago, back in the heady days of the Hampshire Linux User Group, we had a wiki. It ran a heavily patched version of UseModWiki that we'd modified to add anti-spam and anti-abuse protection. We affectionately called it "AbuseMod". It's still kinda there, but I don't think the content is ever touched.

We used it to co-ordinate meetings, take notes, and for some other fun silliness. One such bit of fun was Hugo's Random Benchmark (Note: Not a benchmark). It was a single line we'd each run on our computers to see whose was fastest (Note: Again, not a good benchmark). It did this by counting to 100 million in Perl. It's a super simple single-line shell script which just times how long Perl takes to count from 0 up to 1e8 (100 million).

Here's the "script":

time perl -e 'for($i=0;$i<1e8;$i++) { }'

Here's the resulting output, as produced on a typical Linux system:

real 0m2.868s
user 0m2.828s
sys 0m0.016s


Again, not a benchmark. It's a single-threaded count, so it typically won't get any faster on a dual-core or many-core system. But back in those days a lot of systems only had one core anyway, so the point is moot. It also doesn't "measure" any other part of the system. It's fun though.

Over time we'd add our own systems to the table on the page. Some (such as myself) would strive to run the "benchmark" on ever faster systems. Others aimed for the bottom of the table, and some went for esoteric or imaginary systems.

With each newer system that was measured, the time shrank a tiny bit. Getting from double-digit times to single digits was a milestone, and each update shaved a little more off the total number of seconds.

The page hasn't really been updated for over 10 years now. It was fun at the time, but many of those people have moved on from the LUG, and the site isn't super accessible to edit anymore. It's a little sad, but I do sometimes still go back and grab the script and run it on modern systems, just to see if it's got any quicker.

The output above came from an Intel i7-6820HK running Ubuntu 20.04 under WSL in Windows 10. I don't think back then I'd have envisaged running the benchmark on a system like this. 

I do wonder what the fastest time we can get out of the not-a-benchmark in 2020 might be. Can you get it under a second?

Multiple GPUs in a Skull Canyon NUC

Every 3 years at Canonical we get a laptop refresh fund. I used my last one to buy a ThinkPad T450. My next one is due in November this year. I was considering replacing the ThinkPad with a desktop computer of some kind. I can certainly keep the T450 for portable work, but I mostly sit at the same desk all day, so figure I may as well get a desktop rather than a laptop.

I recently mentioned to my friend and colleague Martin Wimpress that my ThinkPad T450 was becoming a bit long in the tooth. There's a Linux kernel bug somewhere which causes the GPU to lock up randomly when driving 3 displays (internal + two external monitors), which is annoying when you're trying to work on it. It's also struggling to cope with heavy video work - such as multi-party video meetings while other things are happening.

On hearing this Martin told me he had a "spare" desktop I could use, as it's sat gathering dust at his place. Once he explained the specs and what I'd be borrowing, I jumped at it. So now I'm typing this blog post on an Intel Skull Canyon NUC. It's a bit nice! I've never used a computer with an illuminated skull on the lid before :D

The interesting thing about this computer is it features a combined Intel CPU/GPU and AMD GPU on one board. The Intel i7-8809G sports a 4-core 8-thread 3.1GHz main CPU and HD Graphics 630 on-board GPU. However, it's bundled with a discrete AMD Radeon RX Vega GPU too. 

What's even neater is the NUC has a Thunderbolt port, which means I can attach an external GPU should I wish. I wish, so I have. The only GPU I had kicking around was an nVidia GeForce GTX 960 (a hand-me-down from another PC which was recently upgraded to an nVidia 1050 Ti).

So now this computer technically has 3 GPUs, Intel, AMD and nVidia.

alan@robot:~$ sudo lshw -C display
  *-display
       description: VGA compatible controller
       product: Polaris 22 XT [Radeon RX Vega M GH]
       vendor: Advanced Micro Devices, Inc. [AMD/ATI]
       physical id: 0
       bus info: pci@0000:01:00.0
       logical name: /dev/fb0
       version: c0
       width: 64 bits
       clock: 33MHz
       capabilities: pm pciexpress msi vga_controller bus_master cap_list rom fb
       configuration: depth=32 driver=amdgpu latency=0 mode=1920x1080 visual=truecolor xres=1920 yres=1080
       resources: iomemory:200-1ff iomemory:210-20f irq:191 memory:2000000000-20ffffffff memory:2100000000-21001fffff ioport:e000(size=256) memory:db500000-db53ffff memory:c0000-dffff
  *-display
       description: Display controller
       product: HD Graphics 630
       vendor: Intel Corporation
       physical id: 2
       bus info: pci@0000:00:02.0
       version: 04
       width: 64 bits
       clock: 33MHz
       capabilities: pciexpress msi pm bus_master cap_list
       configuration: driver=i915 latency=0
       resources: iomemory:2f0-2ef iomemory:2f0-2ef irq:188 memory:2ffe000000-2ffeffffff memory:2fa0000000-2fafffffff ioport:f000(size=64)
  *-display
       description: VGA compatible controller
       product: GM206 [GeForce GTX 960]
       vendor: NVIDIA Corporation
       physical id: 0
       bus info: pci@0000:40:00.0
       version: a1
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
       configuration: driver=nvidia latency=0
       resources: iomemory:2f0-2ef iomemory:2f0-2ef irq:17 memory:c4000000-c4ffffff memory:2fd0000000-2fdfffffff memory:2fe0000000-2fe1ffffff ioport:3000(size=128) memory:c5000000-c507ffff

Cats and dogs living together!

Bit weird, not gonna lie.

I haven't fully exercised these capabilities yet. I did one quick test to see if I could use OBS to live stream a window being displayed with the AMD GPU, but with the stream encoding done via nvenc on the nVidia GPU. That seems to work well. What a time to be alive! 
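If you're curious which GPU an application actually lands on, glxinfo (from the mesa-utils package) will tell you. A sketch, assuming Mesa's DRI_PRIME offload for the AMD card and the nVidia render offload variables for the GTX 960; exact behaviour depends on your driver setup:

glxinfo -B | grep "OpenGL renderer"
DRI_PRIME=1 glxinfo -B | grep "OpenGL renderer"
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo -B | grep "OpenGL renderer"

Each line should report a different renderer string if all three GPUs are usable.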

Thanks Martin, you're not getting this back. :D