Hahaha! Go Me.

Filed under: Technology — Shu @August 2nd, 2022 1:06 am

Sometime in 2011, a patent t… excuse me, a “non-practicing entity,” sued Apple. Apple counter-sued to invalidate the patent. I just found out that Apple entered my mashup book from 2007 as an exhibit in the case. Awesome.

I Made the Front Page of HN

Filed under: Technology — Shu @June 2nd, 2022 1:12 am

Some friends and I were talking about civil litigation earlier this week, which made me wonder if anyone at HN would be curious about my experience with arbitration as a civilian. Right before I went to sleep, I submitted a link to last month’s post to HN and went to bed. The timing would have hit Central Europe around lunch, and EST at the crack of dawn. I figured if it was going to take off, the algorithms would start pushing it up throughout the Western Hemisphere all day.

And it worked! I woke up to 500+ more points, and they kept rolling in throughout the day. I dove into the discussion, elaborated, argued, and generally watched the mood.

Participating in HN was cool, as was earning that sweet, hard-to-get HN karma, but the other cool part was watching my little 4 GB Linode instance, which I manage myself, handle things like a champ. No complaints in the comments about the site being hugged to death, and every time I clicked, you’d never know it was being slammed.

Congratulations to whoever you are: you were (maybe) the first human to hit the post from HN at 9:52:41 UTC.

Each “hit” makes thirteen simultaneous requests for CSS, JS, and HTML from this server (plus 8 more for Google’s fonts). However, they are very small at just 279.6 KB total. I credit Skeleton and my lack of sophisticated front-end skills for that. On the server side, each hit spawned a php-fpm process that consistently took 3% CPU and 1.5% memory. The server peaked at around 75% CPU a few times.

Currently, the post has 960 points, which is pretty high. Most front page stories are under 200. I have now made the front pages of /., digg, reddit, and the most coveted, HN.

All in all, fun experience.

“Arbitration consumer protection attorney here! Nice work, and nice write up.”
“Damn, that was helpful and encouraging. Pound for pound the best value of any article I have read on HN.”
“This was an excellent write-up! The TL;DR is perfect”
“This is interesting, thanks for sharing and good work!”
“As an attorney, I hope everyone reads this.”

So I Took a Huge Corporation to Arbitration. This is How it Went.

Filed under: Random Stuff — Shu @February 27th, 2022 3:54 pm

It went pretty well.

Last fall, my hot water heater started to leak. I called my home warranty company (hereafter HWC; I KNOW, I KNOW, HOME WARRANTIES ARE NOT WORTH IT), and due to a computer system failure that spanned multiple days, they were unable to send a contractor to fix my water heater.

With water starting to leak through my wall, I found my own plumber, paid out of pocket, and then sent HWC the bill. Sending HWC an invoice and getting reimbursed is actually allowed under my contract in certain circumstances. One of those circumstances, though, is obtaining a pre-authorization from HWC. However, with their systems being down, they were also unable to issue a pre-auth. This seemed like a pretty clear-cut breach of contract on their part, so I figured I was in the right.

Of course, they refused to pay. Trying to find someone at HWC who could show an iota of free-will and thinking was hilarious and impossible.

My oft-repeated question, “You wanted a pre-auth, but you were unable to even accept a pre-auth request, let alone deny it?” was met with everything from silence to just repeating, “We need a pre-auth.” I could actually hear the blank eye-blinking on the other end.

No one I talked to on the phone, from front-line call handlers to managers to case specialists, had authorization to do anything. No one deviated from their procedures and script, to the point where what an individual would say was excruciatingly predictable. I was very impressed that this company had so effectively neutered all staff that had any contact with customers.

After about five calls of runarounds, I realized that customer service channels would not help me and I had to use the legal system.

In the US, it’s not easy for a customer to sue a company. Starting about thirty years ago, it became common for companies to put clauses into contracts and terms of service requiring binding arbitration to resolve disputes. This is cheaper for the company in many ways, including a streamlined resolution process, cheaper lawyers, no sifting through frivolous lawsuits, and finality once a decision is made. It sucks for the consumer because it’s essentially a privatization of justice.

My contract required arbitration from the American Arbitration Association, which is a non-profit organization. That made me feel a bit better. I filed a case online for $200, asking for $1,800 in claims. What’s interesting is that in the filing, I only very briefly described what happened. I summarized the entire timeline into three sentences. I provided no evidence. I didn’t want to show my hand before arbitration. I then waited.

I didn’t have to wait long. Two days later, I received a response from HWC’s attorneys. Apparently, I had finally gotten someone’s attention. In the email, they denied responsibility but offered a 50% settlement plus reimbursement of the $200 filing fee for me to go away. Remember, I purposely withheld detail and evidence in my filing, and yet I was offered a quick settlement. I would love to see the math that goes into this strategy. This was the most interesting part of the whole experience to me.

I respectfully declined. In my email, I was, again, fairly light on details. However, the focus of my email was to point out exactly which clauses in the contract they had failed to uphold, why they failed, and how I tried to uphold my end of the contract. Per my obligations, I first tried to escalate and requested an emergency contractor dispatch. They could not, due to their systems being down. This violated Section X of the contract. A day later, I requested a 3rd-party pre-auth to find an outside plumber. They could not, due to their systems still being down. This violated Section Y of the contract. These were the only areas where I revealed case details. Again, I sent them no hard evidence like screenshots or call logs.

At the end of my response, I counter-offered 80% plus arbitration filing fee reimbursement. The next day, they responded with an acceptance and sent over a release form. Case pretty much closed. I didn’t get 100% back, but the stupid co-pay and service fees I would have been responsible for if they had actually come out would have made my offer more like 90% anyway.

Throughout this, a case manager at AAA checked in with us to see how things were going, and once we told them we were in settlement negotiations, she checked in on that process, too. I had a few questions about the process, and she quickly answered those.

I signed, scanned, and emailed back the release form and about two days later, got a Fed-Ex’ed check in the mail.

The takeaways from my story:
1) Don’t be afraid of arbitration. I did feel AAA was helpful in the process. In fact, heading straight to arbitration may be the best way to fight this system that corporations created.

2) READ. YOUR. CONTRACT. I swear, this got me like 90% of the way there. It doesn’t matter that I don’t have a background in contract law; these customer contracts are designed to be (relatively) approachable. I felt that being able to point out specifically which clauses were violated, and how, gave me a very strong cause of action. I also pointed out there was no out-clause in the contract for computer system failures. I suspect this had something to do with them accepting my counter-offer so quickly. Once the attorney saw that yes, there were breaches of contract, they probably knew my case was pretty strong. The world doesn’t care that you feel wronged or how hurt your feelings are. It wants cold, hard facts.

3) Keep records of phone conversations: date, time, who, resolution. Keep screenshots. Keep emails. I didn’t have to present them to an arbitrator, but if I had needed to, I had the evidence to point out exactly where HWC failed.

4) Don’t get a home warranty.

Galapagos Photography Tips

Filed under: Technology — Shu @September 4th, 2021 4:25 pm

The photos are here.

The Galapagos is often a once-in-a-lifetime trip, so I wanted to get the photography right. I did a bunch of research beforehand and found some good tips and some bad ones. The bad ones still show up in Google searches, so be careful.

You have no control over time and space.

If you take a standard Galapagos cruise like most people do, you are not in control of your schedule. You do everything with your fellow passengers. On excursions, you are often hurried along. You sleep on the boat. You wake up and eat breakfast on the boat. You take a dinghy to an island. You have an activity, whether a hike or scuba, in the morning, go back to the boat, and have a hike or scuba in the afternoon. Your itinerary is heavily regimented, and you cannot go anywhere without a guide. This is the main point to remember. There are ramifications to this:

  • You cannot go anywhere off designated trails to get the perfect shot. No climbing on things, either (duh).
  • You will not be able to camp out at a spot for extended periods of time waiting for a perfect shot. Guides will tell you to keep moving with the group to stick to the schedule.
  • You will have no control over where you are during Golden Hours. You’ll pretty much be on the boat in the morning, so forget that one. Evenings depend on the time of year, but there’s a fair chance you’ll miss those, too.
  • You will not be able to return to a location. These tours have a set itinerary. When you’re done, you’re done. If the weather didn’t cooperate, that’s a pisser. You’ll have to live with full sun beating down on you or full clouds.

You will be snorkeling.

This is a good time to try underwater photography. I used my old Canon XSi because it would have been quite a bummer to screw something up, have the bag leak, and destroy my primary camera on this trip. I didn’t want to blow $1-2K on a real hard case just to goof around under water, nor did I want to trust a camera to a cheap Chinese-made bag from Amazon. The Ewa Marine bag, at $300, was a good compromise, and it worked perfectly.

Prior to my purchase, I asked photography forums about this bag. Almost universally, they ragged on it: bags are untrustworthy, no matter the equipment it will eventually leak, blah blah blah. It was all idiotic, elitist BS. I might use this bag once a year. I’m not going to be abusing it. A $300 bag for occasional use is fine. Buncha dumb gatekeepers who can’t keep things in context.

The only thing I would have changed: I should have gone ahead and gotten a red filter. The visibility is not as clear as you would expect in some areas. Five meters under water is really not a lot, and your subjects will likely be more than five meters away in any direction. Plus, that rule assumes full sun, which even in July we did not get at all during our trip.

You probably won’t need a tripod.

Maybe a monopod. But see the point above about having to move often. You often won’t have time to set everything up perfectly, take your perfect 3-second shot, and move to another angle to repeat. I had a camera backpack with a small tripod and never used it.

Plus, the trails are *very* rugged, often nothing but large volcanic rocks that require climbing. It would be a pain in the ass to lug a sturdy Manfrotto tripod with you.

You will need a good telephoto lens

On one forum I ran across, someone said you won’t need a telephoto lens because the animals are so tame. Utter BS, and the worst advice about Galapagos photography I found. First, they’re animals; they’re not obligated to come right up to you. Second, you absolutely cannot deviate from the trails. Our guide had stories of booting people from the tour because they kept wandering off. This rule is no joke. You will need a good telephoto to reach subjects beyond the immediate trail areas. Third, birds. You can get some magnificent pictures of birds in flight if you have a good lens.

It was definitely a trip of a lifetime, and I got the chance to take some amazing photos. Just remember to bring your best equipment, whatever it may be, and a lot of common sense, and understand that while the wildlife is tame, the restrictions pose some challenges to getting the perfect photo.

I was in the Galapagos for a week.

Filed under: Technology — Shu @August 14th, 2021 2:20 pm

And I spent almost two weeks total with Kali Linux on an old MacBook Air as my only computer. People are often warned not to treat Kali as a normal everyday OS due to hardware compatibility issues, so I was curious how true that was.

Why Linux and Why Kali?

I wanted a laptop that could do everything I needed, but I wouldn’t cry if it got lost or stolen. Everything about the Galapagos is very controlled and safe, but we were putzing around Ecuador for a few days before with long layovers in Mexico City and Houston. Linux on an ancient and clean laptop was an ideal choice. Everything would be locked down, and if it got stolen, the damage of losing accounts and personal data would be isolated.

Kali was chosen because I had already been working with it in VMs for a while. I didn’t have enough time beforehand to screw around with changing desktop environments, etc.; I needed something up and running sooner rather than later. I hate GNOME, and the lighter weight of Xfce would be ideal for the older specs of this Air (4 GB RAM, 1.8 GHz i5).

What Was I Going to Do With It

  • Standard web, email, youtubing
  • Slacking, Discording, and Signaling
  • Docker to screw around with coding if I had down time
  • Image sorting and previewing of RAW photos. Not editing.

What Went Well

Web, Email, and Youtubing

Stellar. Absolutely no problems. The included Firefox worked well, but I installed Brave and went to town. I never had any problems with any sites, including ordering stuff off Amazon. Email was done via web clients. No problems there, either, and with passwords never being saved, I didn’t have to worry.

Slacking, Discording, and Signaling

No problems here, either. The fan kicked on due to the high demands, but they all worked great. You have to add repos for all three, so it’s one more step than a plain apt-get install from the command line or GUI. I was able to keep in touch with friends.

Docker and Coding

No problems here, either, but then again, I really didn’t have time to do much. I downloaded a simple Python editor and goofed with code here and there on the plane, but nothing too heavy.

VPN

No problems connecting to my work VPN (OpenVPN) via the commandline. I didn’t need to, but felt connected to the real world should there have been any emergencies.

What Sucked

Image Sorting

I wanted a way to review the hundreds of pictures I would take per day and quickly toss out any that obviously sucked. I didn’t want to edit, because I knew this 4GB machine wouldn’t be able to handle it.

Overall, the experience sucked. I didn’t get a chance to download a real program like RawTherapee or Lightzone beforehand, though those probably wouldn’t have made things better: they’re demanding, and overkill for the task. Kali includes two programs that can view RAW images: Ristretto, which had a bad habit of opening *every* file in a directory, which killed my workflow; and Atril, which opens all images in their own separate windows, doesn’t tile the windows sequentially, and has no deletion feature.

I ended up using Brave, of all things, and then manually deleting later. Brave at least let me quickly view images in sequential order. The only thing I was happy with was the SD card slot on the Air, which let me easily transfer the files to the hard drive. Kali handled this just fine.
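For what it’s worth, the manual-deletion half of that workflow is scriptable. Here’s a rough Python sketch (file names and layout are hypothetical, not what I actually ran) that sweeps a list of rejected frames into a subfolder instead of deleting them outright:

```python
from pathlib import Path

def cull(raw_dir: str, rejects: list, trash_name: str = "rejects") -> list:
    """Move the named reject files into a trash subfolder; return what moved."""
    raw_path = Path(raw_dir)
    trash = raw_path / trash_name
    trash.mkdir(exist_ok=True)
    moved = []
    for name in sorted(rejects):
        src = raw_path / name
        if src.exists():
            # Move rather than delete, in case a "reject" turns out to be a keeper
            src.rename(trash / name)
            moved.append(name)
    return moved
```

Review in the browser, jot down the bad frame numbers, then run the sweep once you’re back at the hotel.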

Freakin’ NTP of All Things

Shortly after installing, the system clock would always reset itself to February 1, 2021, 8:00 am. I have no idea what is so special about this timestamp. This was a huge problem because it made SSL certificates on sites appear invalid, which nuked web browsing.

Looking online, I found a bunch of posts blaming timesyncd, the system clock, and NTP. I eventually found one solution that worked: https://github.com/nu11secur1ty/Kali-Linux/tree/master/NTP

This took a long time to find and I’m still not sure why it worked, but it’s the only solution that worked. Clearly, though, this is a problem that Kali should address.

Waking from Sleep/Suspension

I didn’t dive too far into this, but there are a lot of discussions across all distros about laptops not waking from sleep correctly, often forcing restarts. My experience mirrored this. If I closed the laptop or chose Suspend/Hibernate in the UI menu, sometimes I could wake it later by pounding on the keyboard, opening the lid, etc. Usually I couldn’t, and I had to force a restart. I ended up just shutting down whenever I remembered to.

The most common “try this” solution was to increase the swap size. I did, but it didn’t work. It seems the system would wake, but the display wouldn’t. This is probably tied to the hardware and not worth exploring.

Hotel Webcaptive Screens

You know those screens in cafes and hotel networks that pop up when you go to a website and require you to accept the terms and agreement, or enter your room number, before you can connect? Linux in general, and Firefox specifically, don’t handle these well and generally won’t show those popups.

The most reliable way is to connect to the network, go into the terminal, curl -LG to Google, see the URL it really connects to, copy/paste that URL into a browser, then accept the terms. Otherwise, nothing connects and you’re left wondering why Linux can’t wifi.
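If you’d rather not remember the curl incantation, the same probe can be sketched in a few lines of Python. This is just a sketch of the idea, not a guaranteed-to-work-on-your-hotel tool: request a page that should never redirect, and if you land somewhere else, that somewhere is the portal’s login page.

```python
import urllib.request
from typing import Optional

def portal_url(probe: str, final_url: str) -> Optional[str]:
    """If the network redirected our probe, return where it sent us (the portal)."""
    return final_url if not final_url.startswith(probe) else None

def find_captive_portal(probe: str = "http://neverssl.com/") -> Optional[str]:
    # neverssl.com deliberately stays on plain HTTP, so captive portals can
    # intercept it; geturl() reports the URL after any redirects
    with urllib.request.urlopen(probe, timeout=10) as resp:
        return portal_url(probe, resp.geturl())
```

Paste whatever find_captive_portal() returns into a browser and accept the terms there.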

Overall

Yes, a very pleasant experience if you know what you’re doing! You do need a fair amount of Linux experience to figure out things like the NTP issue and webcaptive issue. It met almost all my needs, and I felt safe with it. I will definitely use this setup for traveling out of the country in the future.

Books on macOS and iDevices are a Mess

Filed under: Uncategorized — Shu @June 25th, 2021 7:16 pm

What was once a simple paradigm for syncing books between desktop machines and iPhones is now a mess, thanks to iCloud, the detachment of iDevices from the desktop, but mostly just plain half-baked implementations. I simply want to take a PDF or eBook and read it on my desktop, my iPad, or my iPhone. The latter two would have to support offline reading, like on a plane.

After googling and getting nothing but old blog spam from 2013, and after countless toggles of settings, I have discovered:

1) If Books on macOS or iOS/iPadOS is set to use iCloud Drive, you will not be able to transfer a book from your hard drive to an iDevice. Syncing does not work, and dragging and dropping into the Books section on macOS does nothing. The user is left with no error messages or warnings.

2) You cannot see your Books through icloud.com nor iCloud Drive on macOS. The only way to see Books on iCloud is to turn on iCloud for Books on the Mac or iDevice.

3) Dragging a book/PDF from your file system to macOS books when iCloud is turned on for Books will upload the Book/PDF to iCloud. Maybe.

4) Unless you want *all* your books and PDFs to be on iCloud, you need to download everything to your Mac, delete them from Books, and then manually sync. Otherwise, the ridiculously small 5GB iCloud space you’re given will be eaten up real quick.

5) I purchased Neuromancer. Books keeps offering the other two books in the trilogy, and I can’t remove them from macOS or my iDevices. My wife bought a series of other books; I can’t remove those, either. I can’t get rid of this Winnie the Pooh shit, either.

6) You have to go into your account settings to unhide books. There isn’t an easy way to see *all* your purchases from the store. There is no Hide functionality in macOS Books, and I’m not sure how to bring books back once deleted.

I wish someone could explain this paradigm to me. If it’s just an eBook store, I shouldn’t be able to manage my own PDFs and eBooks with it. Either way, the behavior shouldn’t be completely different when one device is using iCloud.

TL;DR – Apple’s eBook implementation is a complete clusterfuck. Turn off iCloud usage for Books on all your devices and just manage it yourself via the sync function on the device in macOS.

Adventures with Python’s Asyncio

Filed under: Technology — Shu @June 11th, 2021 6:02 pm

I finally got the chance to work with Python’s asyncio package on a semi-real-world project recently. I have a scraping program that scrapes investment fund data and inserts it into a database. As the number of funds grew, so did the number of investment vehicles. In revisiting the app, I decided it was a good time to finally try rewriting it with asyncio.

Asyncio has been out for years now, but it has taken some criticism and knocks. Instead of trying to wrap my head around it, my strategy had been to wait things out and use Go for data engineering work instead. Go makes concurrency wonderfully easy, but I didn’t want to redo this app in Go because of its reliance on BeautifulSoup, which is pretty incredible at parsing. Over a few Python releases, asyncio has been honed, simplified, and abstracted enough for mortals to use, so I thought I’d give it a shot.

There are two key steps in this app that would benefit from async: retrieving and parsing a page, and saving the data to the database. I read a few tutorials and gave it a first pass. The results were pretty surprising.

Trial 1: Synchronous, Parsing and Clean Database (832 Parsings)
Trial Milliseconds
1 366083
2 298531
3 258822
4 245793

Average: 292307.25, SD 54056.86

Trial 2: Asyncio, Parsing and Clean Database (832 Parsings)
Trial Milliseconds
1 530063
2 456371
3 429044
4 438373

Average: 463462.75, SD 45825.87

There was a noticeable “warm-up” on the first run of each set; subsequent runs were usually faster. The key takeaway, though, was that while asyncio was more consistent, on average it was almost 60% slower than synchronous!
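As a sanity check, the summary stats fall straight out of Python’s statistics module when fed the raw trial times from the tables above:

```python
import statistics

# Raw times in milliseconds from the two 832-parsing trial tables
sync_ms = [366083, 298531, 258822, 245793]
async_ms = [530063, 456371, 429044, 438373]

sync_mean = statistics.mean(sync_ms)
async_mean = statistics.mean(async_ms)
sync_sd = statistics.stdev(sync_ms)    # sample standard deviation
async_sd = statistics.stdev(async_ms)

# How much slower the async version was, on average
slowdown = async_mean / sync_mean - 1
print(f"sync {sync_mean:.2f} ms, async {async_mean:.2f} ms, "
      f"async {slowdown:.0%} slower")
```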

Clearly I was doing something wrong. I suspected three things:
1) I did not fully understand the package and wrote the whole thing incorrectly. (High suspicion)
2) Database operations were not being asynced. (High suspicion)
3) BeautifulSoup and downloading were not being asynced. (Low suspicion)

I attacked #1 and #2 simultaneously.

I started to re-read tutorials and documentation, made some changes to the code, and re-read a lot of things. I mainly learned that a lot of the blog posts and documentation out there are outdated, needlessly complex, and/or do not simulate real-world situations. Sleep() does not stand in for blocking operations IRL! Defining a simple async function that is awaited definitely fit my needs, and thinking back on work projects, maybe 80-90% of my historical data engineering needs.
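To make that concrete, here’s a minimal sketch of that shape. The fund URLs and the parse step are stand-ins, not my real code, and pushing the blocking work onto a thread with asyncio.to_thread is one way to keep it from stalling the event loop, not necessarily what the original app did:

```python
import asyncio

def blocking_parse(url: str) -> str:
    # Stand-in for the real download-and-BeautifulSoup work, which blocks
    return f"parsed:{url}"

async def fetch_and_parse(url: str) -> str:
    # asyncio.to_thread (Python 3.9+) runs the blocking call in a worker
    # thread so other coroutines can make progress in the meantime
    return await asyncio.to_thread(blocking_parse, url)

async def main(urls):
    # gather schedules every coroutine and collects the results in order
    return await asyncio.gather(*(fetch_and_parse(u) for u in urls))

urls = [f"https://example.com/fund/{i}" for i in range(3)]
results = asyncio.run(main(urls))
```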

After this, subsequent runs produced similar numbers. The problem had to be elsewhere.

I went down the rabbit hole of asyncing the database operations. The project uses SQLAlchemy, and the result of this dive was not good. SQLAlchemy’s async features are in beta. That’s fine. However, the drivers also need to be async, and the only reliable one right now is for Postgres. There is one for MariaDB/MySQL, aiomysql, but I ran into so many still-open bugs that I had to abandon that effort.

So, I was ready to just ditch asyncio for now and stick with synchronous code. Just to test, though, I removed parts of the code that wrote to the database. My code would just download and parse. Guess what? Similar results. The async version was much slower. Database operations were not the blocking factor.

At this point, I felt more confident in how asyncio worked, so I thought this through. The main workhorse functions did parsing and database insertions. Since they were defined as coroutines, they all had to run to completion. The parsing had to occur before the database operations; there was no way around that. Whether BeautifulSoup or SQLAlchemy was async or not didn’t matter. Both had to run, so both were in the same async’ed function.

What else could it be? Reading more and thinking about it: there’s a lot of overhead in asyncio, and others have noted it is slower in certain use cases. Maybe there was an economies-of-scale issue here.

I took this opportunity to update my dataset. I was parsing about 832 investment vehicles with each run, but those were out of date. I updated the funds I was monitoring and this jumped it to 1,049. I then re-ran both the async and sync versions.

Trial 3: Synchronous, Parsing and Clean Database (1,049 Parsings)
Trial Milliseconds
1 338347
2 256635
3 249421
4 241273

Average: 271419, SD 45057.80

Trial 4: Asyncio, Parsing and Clean Database (1,049 Parsings)
Trial Milliseconds
1 295613
2 267927
3 245697
4 231022

Average: 260064.75, SD 28138.98

Both methods sped up, but the async version finally decreased below the synchronous version! And it was way more consistent!

Lessons learned:

  • By increasing the amount of work, the async run time went from about 7.7 minutes to about 4.3 minutes. This leads me to believe there is an overhead cost to asyncio, and you will not see the benefits on small datasets and projects.
  • Multiple blog posts say “async/await creates a coroutine.” This is not true: async creates the coroutine; await just means run this call until it’s done.
  • That means it probably doesn’t matter very much if the libraries you use for blocking operations are not async-ready. Don’t stress it. You can just place them in a coroutine (defined by “async def”), await the call, and it will be run to completion for you.
  • Only coroutines, again defined by async def, can be awaited. (Except generators, which are another can of worms.) Unlike Node, it’s fine to define an async function without any awaits, because the whole thing just becomes an awaitable coroutine.
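Those last points can be shown in a tiny toy example (function names are obviously made up): async def is what creates the coroutine, await is what runs it to completion, and a coroutine with no await inside is perfectly legal.

```python
import asyncio

async def answer() -> int:
    # No await anywhere in here; still a perfectly valid coroutine
    return 42

async def main() -> int:
    # Calling answer() only creates a coroutine object; await actually runs it
    return await answer()

result = asyncio.run(main())
print(result)  # 42
```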

All in all, good fun, good practice, complicated, and a lot easier to control than Javascript’s async operations.

Birch Bay Race

Filed under: Keto,Races — Shu @April 11th, 2021 10:14 am

15k, 1:37 time, 10:27 pace, minorly keto’ed (I don’t think I was that fat adapted). Terrible time, but it was my first race in about a year and a half.

The route was beautiful. About three miles run along Birch Bay, then into Birch Bay State Park for a mile or two, then back up along the bay. The sun was out but did pretty much nothing to the temperature, and it was quite windy. In certain parts, you were running into a strong wind. Annoyingly, when we checked out the next day, there was almost no wind in the area.

It feels pretty good to get out, but I’m definitely not fully fat adapted, and after two years, I’m still dealing with this piriformis issue. The two kind of feed off each other: I’m not doing the 25 miles per week of training because of the butt issue, which keeps me from getting as fat adapted as I need to be to get under a 10:00 pace.

Stop Recommending Home Assistant for Home Automation

Filed under: Home Automation,Technology — Shu @November 18th, 2020 8:05 pm

This post expands on and refines a comment I made in response to a blog post by Rob Dodson, in which he explains why he ripped out a $9,000 automated home lighting system. His post centers on the Z-Wave device protocol, but touches on Home Assistant, a popular open source control center for home automation devices.

About five years ago, I was looking for a home automation system to control my lights and security system. I chose Home Assistant on a Raspberry Pi as a controller for Z-Wave devices, for a few reasons:

  • Home Assistant is open source, aka free.
  • Raspberry Pis are cheap.
  • Z-Wave protocol is ubiquitous across many devices like switches and sensors.

Basically, it was a cheap solution that gave me a lot of flexibility beyond just controlling lights. And I regret the whole decision whenever things go wrong, which is often.

First off is Home Assistant. There is absolutely no way a non-Linux user would be able to use this software. Despite their claims and goals, it is not designed for the average user, even one who is fairly tech savvy. They have tried to be more user-friendly in the past year or two by moving a lot of admin functions to the web interface. However, so much troubleshooting inherently involves the Linux command line that it is now in this obnoxious state where some operations need to be partially done in the web interface and then continued at the command line. For example, deleting a node: you have to initially “delete” it in the web app to clean things up, then manually remove it from the Home Assistant configuration files.

Another issue with HA is how they love to completely rewrite and break things for the sake of rewriting things. In my five years with HA, I have watched recommended install methods get created, rebranded, and deprecated (hass.io), a complete overhaul of the UI architecture, a complete overhaul of the Z-Wave architecture, and non-functioning placeholders inserted into the web interface for features to be implemented later. You *have* to read the release notes before updating. I’ve run into occasions where pip upgrades broke the whole setup. Good luck googling for answers in this environment when you have a problem; if the answer is more than a year old, it’s probably not relevant anymore.

“Well, just don’t upgrade,” you may say. The interface nags you when an upgrade is available. It is much softer now, but previously, it inundated you with security warnings about running an out of date version. If you’re an average user, who just trusts developers at face value, you’re going to upgrade. Because security!

We wouldn’t have to google problems so often if not for the documentation and terminology. The documentation is not terrible, but it is not geared toward end users. Home Assistant abstracts everything it touches into an entity and a node. If you simply want to do X, you’ll have to identify what the entity is in your setup, and what the node is. When you add a Z-Wave device to your network, Home Assistant creates an entry for that one device in the user interface under Entities, Nodes, Switches, Z-Wave, and Devices. Five years later, and I still can’t tell you definitively what the difference is between the five.

It doesn’t help Home Assistant that Z-Wave and the devices themselves are so flaky. If you want Z-Wave light switches, you’ll find three brands: GE, Honeywell, and Jasco. However, they are all Jasco; the first two (and maybe also some Leviton switches) are just rebranded Jascos. The first generation of Jascos is complete garbage. I’ve had a 40% failure rate with them; they’ll randomly die, or fry in a power failure. So far, knock on wood, no failures with the second generation.

It’s extremely common for switches to misreport their state to Home Assistant, and Home Assistant can’t do anything about it. A switch may tell Home Assistant, “I am off” or “I am on,” but the actual mechanism to turn the light on and off may be broken. What’s Home Assistant supposed to do? Ask the switch, “Are you lying to me?” Home Assistant is trying to herd cats. This is a problem with not just home automation, but any technology that is built on a controller/worker architecture.

Home Assistant/Z-Wave delivered on my desire for a cheap and flexible option. Is it reliable? Nope. Is it easy to maintain and fix? Ha. All too often, I see recommendations for Home Assistant when someone asks, “I’m looking for a cheap home automation system.” Unless I know that person has experience in Linux, it’s a disservice to recommend Home Assistant. On the other hand, looking at other platforms and hubs, Home Assistant still gives you the most flexibility and consolidation. This really says more about the dismal state of home automation rather than praise for Home Assistant.

It’s 2020, and home automation is nowhere near “just works.”

Google and Apple Have Been Screwing Around With Email Security and it Doesn’t Actually Help Security

Filed under: Technology — Shu @September 19th, 2020 3:55 pm

TL;DR – Two-factor authentication (2FA) with Gmail, Apple’s native mail clients, and an email forwarding service is much harder to set up than it needs to be due to poor communication by both Google and Apple. The end result is that, for anyone unwilling to give up the Gmail/Apple/forwarding combination, it’s much easier to just turn off 2FA, which is a bad thing.

I try to do the right thing. I turn on two-factor authentication for my Gmail account, don’t reuse passwords, etc. However, getting this up and running across all my Apple devices was a PITA and took a huge amount of googling and experimenting. Apple and Google’s history on mail, like just about everything, has been a clash of tech egos and insanity. I went through periods of turning 2FA on and off to get things to work correctly, and I can only imagine a non-technical user just saying, “fuck it” and keeping it off. So despite Apple’s and especially Google’s preachings, their bickering and hair-pulling business decisions have helped no one but hackers.

In the old days, websites would just require a password to sign in. Two-factor authentication became popular as a way to secure more important sites. Two-factor authentication is the process where a site or company requires a second way to verify that it is you signing in, beyond just a password. Banks typically do this. If you are signing in on a new computer or browser, they often will send you a code in a text message on your phone, and you have to enter that code in the browser before they’ll let you in. The idea is that it’s harder to compromise both your password and your physical phone than just the former.

Google is Mostly to Blame

Instead of a simple text message, though, Google requires you to authenticate via the Google App. You have to download their Google App on your phone, and when you sign in elsewhere, the verification is done in that app. You have to tap “Yes, I am trying to sign in on the desktop” in the Google App. There are also scenarios where they text you a code to enter somewhere in addition to the app.

This is all fine and dandy, and makes it slightly harder to pull off an attack called SIM swapping, but there are plenty of scenarios where you aren’t signing in via a browser. For example, an email application. I also have security cameras that are controlled by the camera maker’s app, and I have to enter a username/password the old-school way so they can email me when motion is detected. For those scenarios, Google introduced two methods – a setting called “Allow Less Secure Apps” and “App Passwords.” The former allows your username/password to be used anywhere in an old-school way, and the latter is available when you have two-factor authentication enabled. App Passwords generate one random password per application and tie it to a device, basically forbidding you from reusing passwords. So if someone breaks into my camera system, they can’t use that password to sign in on a Windows XP machine in Russia.
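To make the camera scenario concrete, here’s a hedged Python sketch of a gadget emailing an alert through Gmail’s SMTP server with an App Password standing in for the account password. The addresses are made-up placeholders, and the 16-character App Password is whatever Google generated for this one device:

```python
import smtplib
from email.message import EmailMessage

# Hypothetical placeholder addresses for illustration.
SENDER = "camera.account@gmail.com"
RECIPIENT = "me@example.com"

def motion_alert() -> EmailMessage:
    """Build the alert message the camera would send."""
    msg = EmailMessage()
    msg["From"] = SENDER
    msg["To"] = RECIPIENT
    msg["Subject"] = "Motion detected"
    msg.set_content("The driveway camera saw something move.")
    return msg

def send_alert(msg: EmailMessage, app_password: str) -> None:
    """Log in with the App Password exactly where the account password would go."""
    with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
        smtp.login(SENDER, app_password)  # revoking this password kills only this app
        smtp.send_message(msg)
```

The nice property is in that comment: revoke the App Password and only the camera is locked out, while the account itself is untouched.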

The problem is that Google, as usual, does a crappy job of communicating these changes and educating the layman about them. Further, as is typical of Google, you can never trust how long they will keep supporting a method or product.

For both “Allow Less Secure Apps” and “App Passwords,” they freakin’ warn you all over the place that this is insecure and you shouldn’t use it. That’s definitely true for Allow Less Secure Apps, and technically true for App Passwords, but come on. App Passwords are essential for those scenarios without a browser and are actually a pretty good security mechanism.

Now you’re in a situation where App Passwords are perceived to be dangerous, and because Google warns you that they’re a bad way of doing things, the implication is that they may get yanked in the future. So you don’t use them. However, you still have to set up the security camera to email you when a rabbit runs across your driveway. How do you get that working? Turn off 2FA and turn on “Allow Less Secure Apps.” Congratulations, you are now no longer following best security practices. Despite what Google wants you to do, their communication compels otherwise.

Apple is Also Mostly to Blame

I fully admit a lot of Apple’s blame is because of my special snowflake setup. In the email specs and most apps, including the Gmail web interface, you can choose a specific “Reply-To” address that is entirely different from the email address on the system. I use this feature. I use Gmail, but I never give out my Gmail address. The Reply-To address is my alumni address, which forwards to the Gmail account. I send out email through Google, and recipients’ email clients are supposed to use the Reply-To address. They don’t always, and instead respond to my Gmail address. It works, but my Gmail address is not my “real” address.

In my experience this is an uncommon requirement, but I’ve met many people and organizations that use this setup for various reasons. It isn’t exactly a rare edge case.
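In case the mechanics are fuzzy: Reply-To is just another header on the outgoing message, sitting alongside From. A tiny Python illustration, with placeholder addresses standing in for mine:

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "somebody@gmail.com"               # the account that actually sends
msg["Reply-To"] = "somebody@alumni.example.edu"  # where replies are supposed to go
msg["To"] = "friend@example.com"
msg["Subject"] = "hello"
msg.set_content("A well-behaved client replies to the alumni address.")
```

Whether the recipient’s client honors that header is, as noted, another matter.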

Apple’s Mail applications have been inconsistent and slow at incorporating Google’s security changes on both macOS and iOS. The smoothest way for native applications to support 2FA is to launch a browser window, kick you over to Google, make you sign in with Google; Google then says yea or nay, and the browser tells the OS yea or nay. *Hand waves* this is a type of 2FA flow called OAuth.
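For the curious: once a mail client has finished that browser dance and holds an access token, it logs into Gmail’s IMAP server with the SASL XOAUTH2 mechanism. Roughly, in Python (the token acquisition itself is the hand-wavy part and is omitted here):

```python
import imaplib

def xoauth2_string(user: str, access_token: str) -> bytes:
    # SASL XOAUTH2 initial client response; imaplib base64-encodes it for us.
    return f"user={user}\x01auth=Bearer {access_token}\x01\x01".encode()

def open_mailbox(user: str, access_token: str) -> imaplib.IMAP4_SSL:
    """Authenticate to Gmail IMAP with an OAuth access token instead of a password."""
    imap = imaplib.IMAP4_SSL("imap.gmail.com")
    imap.authenticate("XOAUTH2", lambda _: xoauth2_string(user, access_token))
    return imap
```

The point is that no password ever crosses the wire; the token is scoped and revocable, which is why this flow is what Google actually wants clients to use.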

It’s Been Inconsistent on macOS

There are two ways you can add a Gmail account to macOS Mail.app. You can choose “Gmail,” which does a lot of under-the-hood magic, or you can directly enter Gmail server settings via “Add Other Account…” I hated the “Gmail” method. It never allowed you to change your Reply-To address and hid what the hell was going on. For years, up to and including Mojave, I had to use the latter to accommodate my special snowflake email forwarding setup. It worked OK until 2FA was turned on. When you added an account via “Gmail,” it at least supported the OAuth flow. If you add it via “Add Other Account…,” it doesn’t. So it looked like you were dead in the water with 2FA, Gmail, and a forwarding email.

However, you actually weren’t dead in the water, because you could set up an App Password. The problem was Mail.app usually didn’t tell you what was going on. It just spun and spun until it timed out and said “imap.gmail.com is not responding.” Occasionally a dialog box popped up telling you to use App Passwords, and that’s how I discovered that was the issue. I set up an App Password, and lo and behold, it worked. Now, this specific issue could have been 100% Google’s fault. Maybe they didn’t provide a usable error for Mail.app to report. Who knows.

Giving Apple credit, they did indirectly fix things in Catalina. If you add an email account via the “Gmail” method, it now allows you to change your Reply-To email address. So you’re kicked into the OAuth flow and can use an email forwarder. As far as I know, though, “Add Other Account…” still doesn’t support OAuth.

It’s Been a Consistent Dumpster Fire on iOS

iOS is a different story. Setting up a Gmail account as “Gmail” does not allow you to change your email address in outgoing emails at all. This is true even up to iOS 14, released a few days ago, so that is not an option. Setting up the server directly never gives you the App Password warning. It just spins, times out, and says something like “imap.gmail.com is not currently available.” So if you only use email on your phone, you’ll never be told to use an App Password. Again, still the same as of iOS 14.

I can’t imagine how you would figure this out if you didn’t have macOS or if your Google skills weren’t exemplary. If you use a Reply-To, it is just easier to turn off 2FA and set up Gmail as a regular IMAP server. You are led to believe that 2FA, iOS, and email forwarders are an impossible combination.

Other Apps: Yay!

All this mess made me explore other email clients. The thing they all have in common is that they do not try to be clever and hide things, and they allow you to change your Reply-To address.

Edison on iOS was great until it wasn’t. Hopefully an issue like that won’t pop up again.

Thunderbird is a throwback that is usable, but the interface is woefully outdated for modern wide screens and laptops.

Yeah, I actually used Outlook for a while on my 2015 MBP (the last great Macbook Pro) on Mojave. It ain’t terrible.

In Conclusion, This is What You Need to Do…

With a lot of finagling, you can actually use two-factor authentication and special email setups with Apple software. On macOS before Catalina, use an App Password via direct server settings. In Catalina and hopefully beyond, add the account as a Gmail account. In any version of iOS, use an App Password instead of your normal Gmail password. You will never learn this strictly from using iOS.

It would be a lot easier if Google weren’t constantly changing up Gmail security and Apple stopped trying to be so damn clever with their email clients. Both companies need to stop trying to force people into screen time in their ecosystems at the expense of good security.
