Packing for Travel – 2022 Edition

In 2019, I was challenged to write a list of tools I use when traveling. I have not really traveled much since the start of the pandemic, but I have upgraded my gear in preparation for that day, so why not write an updated list?

  • Computer
    • Dell Inspiron 7370 – This is a 13.3″ laptop running Linux, which I bought open-box. As I spent more time away from home, I needed something that wouldn’t slow down under load.
    • USB-C to Dell laptop charging cable – So I could plug an older laptop into a USB-C charger. I also got a USB-C to laptop charging cable for my work laptop.
    • Eleduino 13.3 Inch 2K HDMI Portable Gaming Monitor – There are a variety of these available on Amazon and other sites. I use this as a second monitor for trips.
    • Sansonic EVOPIX 15.6″ Multi-Touch Portable Monitor – I replaced the Eleduino monitor with this one, which I got in a Woot sale. So I continue to run a dual-monitor setup everywhere, with this as the primary monitor and the laptop as the secondary.
    • Kabcon Quality Tablet Stand – This is a bit more stable than the tiny stand that came with the gaming monitor. It is designed to hold larger tablets. However, the Sansonic also stands up by itself, so I don’t always use this.
    • Nexstand Laptop Stand – This brings the laptop high enough to handle a keyboard.
    • Royal Kludge RK61 Wired/Wireless Keyboard – Mechanical keyboard that doubles as a Bluetooth keyboard.
    • Dierya 60% Keyboard – I still have the RK61 as a backup, but I switched to this one because I kept setting off the RK61’s multi-device mode by accident, and because the ? key and the arrow keys share a key on the RK61 but are separate here; the shared key kept tripping me up when typing.
  • Travel Gear
  • Camera Equipment

People-Focused Communication

This week, I ended up in a conversation that referenced Tantek Çelik’s article on People-Focused Mobile Communication, circa 2014. I had followed up at the time with my own thoughts on Unified Communication.

My version of the idea didn’t focus only on the mobile experience; I wanted to embrace the idea overall, which means I’d want it to work on the desktop as well. Also, Tantek is an iOS user and I’m a dedicated Android user, so there is a different approach there too.

The focus was that instead of finding people on service X, you’d find people, then find where they are.

Android has moved in this direction to a degree. Communication apps, if installed, can link directly from the contacts app and add extra information there. So, from a contact in my Contacts app, I can go directly to messaging someone.

But that is the provider doing that, not necessarily the person. Just because I have an account on Message Service A does not mean I want to be contacted there. It does, however, mean that if these apps can link in, a theoretical app like this could as well.

So, this means we need something on our websites, under our control, that provides this information. And theoretically, you can visit that page on mobile, as Tantek proposed, or go even further and have an app that presents it for multiple people as a Contact list…either integrated into the built-in system or separately.

So, that means we need two things to start:

  • An HTML presentation of this contact list
  • Some way for others to discover and parse it in order to integrate it into other things, with or without some sort of identity component (making you log in to see more information).

The first part, the presentation, is where I started. Tantek had written a list of URLs for People-Focused Mobile Communication.

When it came up recently, I wanted to revisit how protocol handlers are still being used, and their limited use on the desktop. So I revisited his list, along with some services that weren’t really a thing in 2014. I am also leaning toward URLs over custom protocols where possible; mobile will generally redirect these to the app anyway. A marked-up sketch follows the list below.

The other depressing thing since 2014 is the increased reliance on phone numbers. This was already starting at that time, but now, it is everywhere. Name a messaging service that isn’t based on your phone number, which is something I generally don’t want to give out.

  • Phone Call – tel:phone number – Call someone using a telephone number.
  • Text Message – sms:phone number – This should activate a text messaging service. Variations include smsto, mms, and mmsto. On Apple, I believe, based on research, you can use an Apple ID address in lieu of a phone number, but again, not universal.
  • Facebook Messenger – fb-messenger://user-thread/username or http://m.me/username – Username or UserID will work. UserID isn’t always easy to find. (More info)
  • Twitter Direct Message – https://twitter.com/messages/compose?recipient_id=3805104374&text=Hello%20world – You would have to find your recipient ID, which is considered preferable as the handle could change.
  • Skype Chat – skype:username?chat – You can see the full API including call or group chat here.
  • Microsoft Teams Chat – msteams://l/chat/0/0?users=Joe@Example.com or https://teams.microsoft.com/l/chat/0/0?users=Joe@Example.com (Deep Linking Reference)
  • WhatsApp – whatsapp://send?phone=15551234567&text=Hello%2C%20World! or https://wa.me/15551234567?text=I'm%20interested%20in%20your%20car%20for%20sale. Without the phone number, it will pop up a selector box asking who to send the text to. (Reference)
  • Telegram – https://t.me/username?text=Hello%2C%20World! A phone number would only work if they are in your contacts.
  • Signal – https://signal.me/#p/15551234567 or sgnl://signal.me/#p/15551234567
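
Pulling a few of these together, here is a minimal sketch of how such a contact page could be marked up as an h-card with u-url links, so a consuming app could parse it. Every URL, number, and name below is a placeholder rather than a real account, and this is just one way to do it.

```php
<?php
// Hypothetical contact block: an h-card whose u-url links point at the ways
// I'm willing to be contacted, listed in my order of preference.
$contact_methods = array(
	'Text message' => 'sms:+15551234567',
	'Signal'       => 'https://signal.me/#p/15551234567',
	'Telegram'     => 'https://t.me/username',
	'Teams'        => 'https://teams.microsoft.com/l/chat/0/0?users=joe@example.com',
);
echo '<div class="h-card">';
echo '<a class="p-name u-url" rel="me" href="https://example.com/">Example Person</a>';
echo '<ul>';
foreach ( $contact_methods as $label => $url ) {
	printf(
		'<li><a class="u-url" href="%s">%s</a></li>',
		htmlspecialchars( $url, ENT_QUOTES ),
		htmlspecialchars( $label, ENT_QUOTES )
	);
}
echo '</ul></div>';
```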

For some services, you can create a room/group/etc and get a webhook to have people post in there. So it could be a room just for this purpose.

But, let’s say you solve the problem of actually linking to these services. IndieAuth solves the problem of showing different people different levels of detail by allowing authentication. The final problem is a fairly simple one…how do you mark it up to show your priority and preferences?

I’m not sure yet, but I think I will add an updated contact page to my site with more ways to find me.

Pingbacks, Trackbacks, and CSS-Tricks

Earlier today, a post was published on popular site CSS-Tricks that referenced my site and a post I’d written. My site has never been especially popular, and isn’t usually picked up in this way.

I immediately started getting something I haven’t gotten in ages: pingbacks and trackbacks. Now, I spent some time as the Pingbacks and Trackbacks component maintainer for WordPress. I’d very much hoped we could iterate to make these features more than just another ignored piece of WordPress.

Of course, I was more interested in their successor, webmentions, which adopts many of the same principles but offers some important changes, the most significant of which is that people are still working on it.

In response to the CSS Tricks post, I got 28 pingbacks and trackbacks. I don’t turn them off on my site, because with the current webmention configuration, disabling the ability to receive them would disable webmentions as well.

But I think I will be adjusting it to immediately remove Trackbacks. Trackbacks have no validation, and I have not ever gotten a legitimate one. WordPress doesn’t allow you to selectively remove one protocol or the other.

Pingbacks, as they do have validation, mean a site actually does have to link to you, not just say it does. But I looked at the quality of those. CSS Tricks seems to have a lot of people republishing its content without attribution.

Some of these are actual WordPress sites, probably running a scraping plugin, that don’t even credit an author; the author is set to the admin account. So, not exactly impressive…although one version did seem to be translated into Spanish.

So, does that mean the only sites still sending pingbacks are sites that wholesale copy other content and put it out there? That has a bunch of different problems with it. It makes me ask if I should turn off pingbacks as well, since they seem disused by anyone interested in quality content.

There is nothing inherently wrong with reposting content…although I am a big believer in proper attribution. When I post about an article, I usually only share a summary and a link.

So, I hadn’t gotten a pingback in over a year, and when I finally did, it was notifications of this sort.

Maybe I will just stick with webmentions and abandon all similar protocols. Eventually, it could in theory have the same problem as pingbacks…namely, less utility. There have been discussions about that from the beginning. But the way that is solved is by iterating. And no one is doing that on pingbacks right now.

I did consider some other choices. I attended a discussion a few years ago on different levels of display based on trust. An untrusted source, until trusted (unless you prefer moderation), might appear only as a number in a counter on your post. As it grew in trust, it might add avatars or other information to the display. That might allow me to keep offering the service.

But, unless someone can show me an example of a quality pingback, it’s probably better to shut it down.

Indiewebifying a WordPress Site – 2022 Edition

In 2018 and 2019, I wrote about how I set up my WordPress site. That included some summaries of the IndieWeb plugins for WordPress I use, and what they do.

Recently, I came across Geoff Graham’s response on CSS Tricks commenting on another post by Miriam Suzanne on implementing Indieweb technology. I asked to speak to Geoff, who I did not previously know, and did so this past Friday.

Earlier that week, someone whose WordPress site I had helped configure at a Homebrew Website Club meeting decided things were a bit too complicated for them. Whenever that happens, I feel like it is a good time to ask…how can we make this better?

I had some suspicions that Geoff might be confused about a few things, and talking would give me a chance not only to explain, but to use that to plan how to prevent the same confusion in the future. This article is also an outgrowth of that.

The CSS Tricks post was in response to Miriam Suzanne…who is using a static site, not WordPress. I’m going to focus purely on WordPress.

The IndieWeb plugin you can get for WordPress was originally conceived as a sort of Jetpack-style bundle, but because each piece of the IndieWeb infrastructure is independent, it does most of this by being a plugin installer and recommender. And it can clearly do better at explaining its recommendations.

The plugin by itself handles establishing your identity as the IndieWeb sees it. It offers an h-card template and widget. An h-card is the microformat for marking up information about a person or place. So, this is an element many people opt to put on their site anyway.

Alternatively, it offers rel-me linking. Rel-me is just a way to tell visitors that a link to another site is a link to another version of me. A set of links to other profiles is also a common website design element. Your Twitter URL would be marked up with rel=me, establishing that your website and your Twitter profile are the same person. To prove that, your Twitter bio should also point back to your site; otherwise anyone could impersonate you. That’s, again, about proving your identity against something verifiable.
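
As a rough illustration (not the plugin’s actual output), the h-card and rel-me pieces boil down to a tiny bit of markup, which a WordPress template could emit like this; the profile URLs are placeholders:

```php
<?php
// Minimal h-card identifying the site's owner.
printf(
	'<div class="h-card"><a class="p-name u-url" href="%1$s">%2$s</a></div>',
	esc_url( home_url( '/' ) ),
	esc_html( get_bloginfo( 'name' ) )
);
// rel="me" links to profiles that link back here, so either side can verify the other.
echo '<a rel="me" href="https://twitter.com/example">Twitter</a> ';
echo '<a rel="me" href="https://github.com/example">GitHub</a>';
```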

Other than a few extra settings, such as telling the code whether this is a single-author or multi-author site, the plugin is as simple as possible, but it is a good gateway to more.

If you want to continue to build your identity, it suggests IndieAuth. IndieAuth is a protocol. There’s some confusion about this, because indieauth.com is a hosted instance of that protocol that uses rel-me links, but WordPress users don’t need any of that. The WordPress implementation is an entirely self-hosted implementation built into your site.

So, what is IndieAuth? IndieAuth is a protocol built on top of OAuth2. If you haven’t heard of OAuth2, it is what those “Login with Google” or “Login with Facebook” buttons are based on. IndieAuth allows you to log into any site with your URL as your identifier. If you use the WordPress version, you put your URL into an application that supports IndieAuth, it redirects to your WordPress instance so you can authenticate by logging in there, and then it redirects back to the application. So, for WordPress users, it is really “Login with your WordPress site.”
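
To make the flow concrete, here is a rough client-side sketch of that first redirect, assuming the authorization endpoint has already been discovered. The parameter names come from the IndieAuth and OAuth 2.0 specs; the URLs are placeholders.

```php
<?php
// Hypothetical IndieAuth client: build the authorization request and send the
// user to their own site to log in.
$authorization_endpoint = 'https://example.com/authorization-endpoint'; // discovered earlier
$verifier  = wp_generate_password( 64, false );
$challenge = rtrim( strtr( base64_encode( hash( 'sha256', $verifier, true ) ), '+/', '-_' ), '=' ); // PKCE S256
$args      = array(
	'response_type'         => 'code',
	'client_id'             => 'https://app.example.org/',
	'redirect_uri'          => 'https://app.example.org/callback',
	'state'                 => wp_generate_password( 24, false ),
	'code_challenge'        => $challenge,
	'code_challenge_method' => 'S256',
	'scope'                 => 'profile create',
	'me'                    => 'https://example.com/',
);
wp_redirect( $authorization_endpoint . '?' . http_build_query( $args ) );
exit;
```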

A Micropub client is a great example of something you can use IndieAuth to log into. The Micropub plugin adds a Micropub server, or endpoint, to your WordPress site. This allows you to use any Micropub client to post to your site. That gives you any number of publishing apps if, for example, you aren’t thrilled with the built-in WordPress editors.
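
For a sense of what that looks like on the wire, here is a sketch of the kind of request a Micropub client sends: a form-encoded POST carrying an IndieAuth access token. The endpoint URL and token are placeholders; the h and content parameters come from the Micropub spec.

```php
<?php
// Hypothetical Micropub client request posting a short note.
$micropub_endpoint = 'https://example.com/micropub'; // discovered from the site
$access_token      = 'placeholder-token-from-indieauth';
$response = wp_remote_post( $micropub_endpoint, array(
	'headers' => array( 'Authorization' => 'Bearer ' . $access_token ),
	'body'    => array(
		'h'       => 'entry',
		'content' => 'Posted from a Micropub client.',
	),
) );
// A successful create returns a 201 status with a Location header pointing at the new post.
```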

The Webmentions plugin for WordPress handles the receiving and sending of webmentions. As with the IndieAuth plugin, people often think it requires webmention.io, which is a hosted webmention provider. The WordPress version is entirely self-contained.

Back when it was built, the plugin handled only the business of receiving and sending webmentions, not display to any degree. Semantic Linkbacks, a separate plugin, handled that not only for webmentions, but also for the older pingback and trackback protocols.

Throughout the pandemic, the primary developer and I have been working on a complete reimplementation of the Semantic Linkbacks display code inside the Webmentions plugin, and we hope to have that done soon, which will eliminate the split (although it deprecates support for enhancing pingbacks, which wasn’t really happening anyway).

Semantic Linkbacks takes a webmention, which is a notification that another site has linked to something on your site, fetches the other site, and tries to render a display of the information. How that is done can vary from just a profile photo (if it can find one) to interpreting it as a full comment.

It does this using Microformats…a way of marking up HTML to allow elements to be identified. It is one of several ways of doing this, but is a very simple and readable one, which is why it is popular in the IndieWeb community.

Since many themes are not properly marked up, we did try creating a plugin to do this with WordPress hooks and filters…the Microformats plugin…but its ability to do so is limited, which is why you are likely better off with a properly marked up theme.

Since many people are not inclined to modify a theme, or not comfortable doing so, the new version of Webmentions will include several alternative ways to try to find an image or summary to display…from OpenGraph (which Facebook and Twitter use to display URLs provided to them) to detecting the WordPress REST API version of a page and using that to get the author name and profile image. None of them will provide as much context as Microformats, but the experience will still be something worth installing.

The other plugins provide other useful functionality for a site interested in taking the place of your participation in social media silos.

A popular goal of members of the IndieWeb community is to syndicate their content to those sites and pull back the interactions to their own websites. However, most people do not want to write integration to the APIs for these sites.

A community member offers Brid.gy as a service for feeding back interactions to syndicated posts from various other sites, by implementing their APIs, and then sending webmentions to your site when someone comments on the syndicated version of your post. The same could be done by implementing the API directly.

Syndication Links helps with syndication by offering a marked up display of links to the syndicated copies of posts. These look similar to the ‘Share with Twitter/Facebook’ buttons many sites have, except they link you directly to the syndicated copy of the post on those sites, instead of implementing tracking or other code in your site.

Brid.gy also offers a service to publish to sites it supports, and Syndication Links can optionally trigger that capability, as well as allow Micropub to trigger it. It supports several other services as well, and may be expanded to more in the future. But if you don’t want this feature, it is disabled by default.

Simple Location is one of my geekier projects. It obsessively adds location context to posts on your site. So, add a location to a post, show a map…the weather, etc at that location. It also adds archives and other data. If you are trying to reproduce the experience of Swarm, or other check-in type functionality on your site…add maps to photo posts, etc, that’s what it is for.

But if that isn’t what you want, it’s fine not to install that piece. Because not everyone’s needs are the same.

If you don’t want to learn how to mark up individual types of posts (as opposed to your theme) with Microformats, the Post Kinds plugin tries to add the ability to post a reply, like, check-in, etc. from your site. It is integrated with the Classic WordPress editor, however, so some may opt out of it.

The IndieWeb implementation on WordPress is a series of building blocks that you can choose to use or not, which is what makes it wonderful, but sometimes confusing. WordPress has a philosophy of decisions, not options. But the IndieWeb is all about options…about building the features that are right for you.

As WordPress users within the IndieWeb community, we can always do a better job of explaining what these things are for, and are happy to do so. We have a live chat, weekly events, and are generally happy to help. But the IndieWeb is not a monolith…we’re a community of people with a common philosophy of using our own websites rather than someone else’s.  That means different things to different people.


Thinking about Planets and Challenges

Earlier today, at the special Transatlantic Bonus Homebrew Website Club, we continued a discussion on trying a community challenge to create content, similar to some of what micro.blog does with their photo challenges.

One of the stumbling blocks was discovery: with something this distributed, how can you essentially follow the people who are participating?

One proposal involved creating a site you log into using IndieAuth and then that would be how you’d join.

I started contemplating simple webmentions. The same way you RSVP to an event…you should be able to create a page for a challenge and have it receive webmentions, which would generate the feed.

So, that is what I’ve been contemplating all afternoon since. The page would work like an old-style planet. A planet is a site that aggregates feeds from a variety of sources with a particular theme or community.

Using webmentions as a publishing avenue is what Brid.gy does. So, there are a few ways I thought this could work.

  1. Like the way Brid.gy does it, the post would be marked up with a u-syndication property, which would trigger a webmention to the page, but instead of it being seen as a comment, it would add it as an h-entry in a feed people could follow. To prevent abuse, there could be the same types of vouches/moderation you’d otherwise use. If you wanted to ‘take down’ a post, you’d use the webmention delete method.
  2. This would be the same, except using the u-category property instead of u-syndication. So, why is this a thought? Because you are tagging it, but just linking it to a tag on another site. The argument for this vs. u-syndication is that the syndication in this case is entirely at the discretion of the receiver…also, the URL is scoped to the feed, not to the individual post.

In both of these, it seems like a relatively easy thing to have your webmention receiver interpret this markup by generating an h-feed, either of reposts of the post, or a simple feed with just URLs to the individual posts.
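
As a rough sketch of what I mean (an assumption about how it could work, not a spec), a receiver could special-case webmentions aimed at the challenge page like this:

```php
<?php
// Hypothetical receiver: keep the source in the planet feed only if it links
// to the challenge page and carries u-syndication or u-category markup.
function maybe_add_to_planet( $source, $target ) {
	$html = wp_remote_retrieve_body( wp_remote_get( $source ) );
	// A real implementation would parse the microformats properly instead of
	// pattern-matching the raw HTML.
	$links_to_target = ( false !== strpos( $html, $target ) );
	$has_markup      = (bool) preg_match( '/class=["\'][^"\']*u-(syndication|category)[^"\']*["\']/', $html );
	if ( $links_to_target && $has_markup ) {
		$feed   = get_option( 'planet_feed_entries', array() ); // hypothetical option name
		$feed[] = array( 'url' => $source, 'received' => time() );
		update_option( 'planet_feed_entries', $feed );
	}
}
```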

This is something that could be easily built into any site that has webmention capabilities with a minimum of additional code.

So, have at it, what am I missing here?

IndieAuth Spec Updates 2022

Over the course of 2021, the IndieWeb community had several popup sessions to continue refining the spec. This culminated in the release of the latest iteration on February 22, 2022.

I really enjoyed Aaron Parecki’s post explaining the changes from the 2020 sessions, and thought I’d write my own this time using the same format. I’ve been heavily involved in the update, but Aaron is embedded in the OAuth world to a degree I’m not, and may have more insights I hope he gets a chance to blog about.

Many of the changes bring IndieAuth closer to OAuth 2.0, ensuring that an OAuth client could support IndieAuth with a minimum of changes.

Metadata Discovery

The first thing an IndieAuth client does is discover the user’s endpoints and redirect the user to their server to authorize the client.

Previously, the client would look for HTTP Link headers for the authorization and token endpoint. As we continue to expand into new use cases, we need a new way to provide information to clients.

The new metadata object, which servers publish and clients retrieve, not only identifies the location of the various endpoints (some of which are optional), but also what the capabilities of those endpoints are.

Changes for Clients: Clients must check for an HTTP Link header or an HTML link element with a rel value of indieauth-metadata. For the foreseeable future, clients should, for backward compatibility, still look for the authorization_endpoint and token_endpoint rel values.

Changes for Servers: The server has to publish the link values for the client to find, and at that URL return a JSON object with properties containing information about the various endpoints. You may wish to place it in the .well-known path, for compatibility with other OAuth 2.0 implementations, but this is not a requirement.
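
For illustration, this is the general shape of the JSON a metadata endpoint returns. The field names come from OAuth 2.0 Server Metadata and the IndieAuth spec; the URLs are placeholders rather than any particular implementation’s actual routes.

```php
<?php
// Sketch of a metadata response a server might emit.
wp_send_json(
	array(
		'issuer'                           => 'https://example.com/',
		'authorization_endpoint'           => 'https://example.com/auth',
		'token_endpoint'                   => 'https://example.com/token',
		'introspection_endpoint'           => 'https://example.com/introspection',
		'revocation_endpoint'              => 'https://example.com/revocation',
		'userinfo_endpoint'                => 'https://example.com/userinfo',
		'scopes_supported'                 => array( 'profile', 'email', 'create', 'update' ),
		'code_challenge_methods_supported' => array( 'S256' ),
	)
);
```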

Issuer Identifier

In order to positively identify different IndieAuth servers, each one will now have a server identifier, indicated by the issuer parameter. It is a prefix of the URL where the server metadata endpoint lives.

This can now be checked to protect against attacks, as IndieAuth clients interact with multiple servers.

Changes for Clients: Clients must now check that the issuer identifier returned from the authorization endpoint is valid and matches the one provided in the server metadata.

Changes for Servers: When the authorization endpoint builds the redirect back to the client it will include the issuer identifier. The issuer identifier will be provided through the new metadata endpoint.
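
A minimal client-side sketch of that check, assuming the metadata document was fetched earlier; the issuer comes back as the iss query parameter on the redirect.

```php
<?php
// Verify the issuer on the redirect back from the authorization endpoint.
$expected_issuer = $metadata['issuer']; // from the indieauth-metadata document
if ( empty( $_GET['iss'] ) || $_GET['iss'] !== $expected_issuer ) {
	wp_die( 'Issuer mismatch; refusing to continue the authorization flow.' );
}
```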

Refresh Tokens

Refresh tokens were always permitted in IndieAuth, but people didn’t know they were an option because they weren’t described.

Changes for Clients: Clients should note whether tokens have an expiry and be prepared to request new tokens using the refresh token process. The new metadata endpoint, if implemented, would advise if a server supported the refresh token grant type. The only negative to not implementing support is that when the token expires, it would be a poor experience for the user to have to reauthenticate.

Changes for Servers: Servers are not required to implement short-lived tokens and refresh tokens. But if they choose to, they would have to support the refresh_token grant type in order to allow clients to get new tokens when one expired.
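
The refresh request itself is standard OAuth 2.0. A sketch, assuming the server advertises the refresh_token grant type in its metadata:

```php
<?php
// Exchange a stored refresh token for a new access token.
$stored_refresh_token = 'placeholder-refresh-token'; // saved from an earlier token response
$response = wp_remote_post( $metadata['token_endpoint'], array(
	'body' => array(
		'grant_type'    => 'refresh_token',
		'refresh_token' => $stored_refresh_token,
		'client_id'     => 'https://app.example.org/',
	),
) );
$token = json_decode( wp_remote_retrieve_body( $response ), true );
// Expect a new access_token, an expires_in value, and possibly a replacement refresh_token.
```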

Revocation Endpoint

The previous version of the spec overloaded the token endpoint to provide revocation with the action=revoke parameter.

Changes for Clients: Clients should support discovering the new endpoint through the server metadata endpoint and utilizing it.

Changes for Servers: Servers may wish to support the old revocation method for backward compatibility for the foreseeable future, but should implement the new endpoint.

Token Introspection Endpoint

This new version introduces the token introspection endpoint, discoverable through the new metadata endpoint. This replaces the previous token verification process with one based on the OAuth 2 Token Introspection process. This also means a change to the response.

The major difference between this method and the prior one is that the previous method was a GET request, while this is a POST and requires some form of authentication.

Changes for Clients: None…token verification is meant to be done by resource servers, such as a Micropub endpoint that is not coupled with the IndieAuth endpoints. Some clients may have been using the verification process, and must remove this.

Changes for Servers: The introspection endpoint is also optional. The old GET option may be retained for a time, but it is best to discontinue as soon as possible as the previous verification endpoint was not meant to be used by clients.
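
A sketch of the introspection call a resource server might make, following the RFC 7662 shape. How the resource server authenticates to the endpoint is left open, so the bearer credential here is purely an assumption.

```php
<?php
// Ask the introspection endpoint whether a presented access token is still valid.
$access_token               = 'token-presented-by-the-client';
$resource_server_credential = 'placeholder-credential';
$response = wp_remote_post( $metadata['introspection_endpoint'], array(
	'headers' => array( 'Authorization' => 'Bearer ' . $resource_server_credential ),
	'body'    => array( 'token' => $access_token ),
) );
$info = json_decode( wp_remote_retrieve_body( $response ), true );
if ( empty( $info['active'] ) ) {
	// The token is expired, revoked, or unknown; reject the request.
}
```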

New User Info Endpoint

A previous update to the spec added a profile scope and a profile return to the authorization response. This addresses the scenario where a client wishes to refresh that profile information by allowing for an optional user information endpoint, discoverable via the metadata endpoint.

Changes for Clients: Clients supporting or using profile information may, if a user information endpoint is available, choose to query it periodically for updated information. This would allow for refreshing avatars and display names automatically.

Changes for Servers: Implementing a userinfo endpoint is, of course, optional. In most cases, if you were returning the profile information in the authorization response, it should be relatively easy to add the endpoint.

Clarification of Profile and Its Scope

There were questions regarding the definition of the return values for the profile information, which were clarified in the update, and more significantly, the application of the profile scope…specifically, could you issue a token with only the profile scope and what that would mean.

The language of the previous update made some individuals believe that a token would not be issued if the request contained only the profile scope. This was clarified.

If you need a token, you would redeem your authorization code at the token endpoint…which would allow you to have a token with just a profile scope…which could work well for the new userinfo endpoint. If you don’t need a token, just to know the user logged in, you can do the same redemption at the authorization endpoint.

Changes for Clients: This should be addressed per the use case above: namely, whether you need a token or not.

Changes for Servers: If you implemented this during the prior update and set it so that you could not get a token with a profile-only scope, due to a misreading of the intentions of the specification, you should change this. It shouldn’t affect any client.


Bookmark Links Plugin for WordPress Ready for Beta Use

My creatively named Bookmark Links plugin for WordPress is now available for beta use.

This is an enhancement of the Links feature in WordPress, which has been disabled by default for a decade now. The database table for this still remains, though I’ve extended it with a separate metadata table, which uses the WordPress metadata API.

The fun of this project was trying to add all of the features that WordPress might have added to Links if they’d continued developing the feature. So, all the enhancements made to comments, posts, etc. That means an improved interface (the admin list didn’t even have pagination), as well as a REST API endpoint, and more.

I’ve hooked it up to an Android app, which allows me to share URLs to it via the REST API and save them.

The original feature was designed as a blogroll…this is designed as a bookmark store. It has a built-in read later indicator, and various other pieces of metadata. I also added an import and export option just to cover myself.

For sharing with others, there is an option in the admin to publish a single bookmark as a bookmark post, or multiple bookmarks in list form as a post.

There is a lot more I would like to do with it, but I’d love to see people using it and suggesting improvements.

Meta Tags to Microformats

Earlier today, Jamie Tanna announced the opengraph-mf2 library and hosted project. It takes OpenGraph meta tags and converts them to microformats.

I do the same thing as one of the many pieces of my somewhat messy Parse This library. Parse This, which is designed to feed WordPress plugins, forms the basis of the reply contexts in the Post Kinds plugin, the parsing for the Yarns Microsub plugin, and my newly released bookmarks plugin. In all cases, it tries to extract as much data as possible about the URL sent to it and return it in Microformats 2 JSON or the simplified JF2 format.

Jamie’s code is a simple 80 lines that takes a few tags and tries to convert them. I ran through every meta tag I could find by looking at dozens of different sites, so I was inspired to document them.

First of all, if you look at MDN’s definition of the meta tag, it states that if the name attribute is set, the meta element provides document-level metadata that applies to the whole page, but if the itemprop attribute is set, that’s user-defined metadata. The content attribute contains the value for the name attribute. There is no mention of the property attribute in the HTML spec, but it is used by the Open Graph protocol.

I take name, property, or itemprop and map it to the key in an associative array, with content as the value. For keys with CURIEs (compact prefixes separated by a colon), I use the prefix to create a nested array, which is what I use to map properties.
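
A minimal sketch of that collection step (not the actual Parse This code):

```php
<?php
// Collect meta tags into a nested associative array, splitting prefixed names
// like og:title on the colon.
function parse_meta_tags( $html ) {
	$meta = array();
	if ( ! preg_match_all( '/<meta[^>]+>/i', $html, $tags ) ) {
		return $meta;
	}
	foreach ( $tags[0] as $tag ) {
		$name    = preg_match( '/(?:name|property|itemprop)=["\']([^"\']+)["\']/i', $tag, $n ) ? $n[1] : null;
		$content = preg_match( '/content=["\']([^"\']*)["\']/i', $tag, $c ) ? $c[1] : null;
		if ( ! $name || null === $content ) {
			continue;
		}
		if ( false !== strpos( $name, ':' ) ) {
			list( $prefix, $key )      = explode( ':', $name, 2 );
			$meta[ $prefix ][ $key ][] = $content;
		} else {
			$meta[ $name ][] = $content;
		}
	}
	return $meta;
}
```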

There are common classic meta names that are longstanding and defined in the HTML specification, such as author, description, and keywords. If nothing else, this might generate some simple information.

Moving up a level to OpenGraph…there are several common metadata fields, namespaced with og. A small mapping sketch follows the list below.

  • og:title – this would map to p-name
  • Media – Some media has the :secure_url addition for the https version of the image. This is still used, although the modern utility is sometimes questionable.
    • og:image – this would map to u-photo.
    • og:video – this would map to u-video
    • og:audio – this would map to u-audio
  • og:url – this would map to u-url
  • og:description – this would map to p-summary
  • og:longitude, og:latitude can map to the equivalent location
  • og:type – The type is a bit harder to map, but can be used as hinting otherwise. Article as a type would be considered h-entry, profile would be h-card, music and video types would be h-cite.
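
Taken together with the tag parsing above, the mapping itself is little more than a lookup table plus a type hint. A minimal sketch (again, not the actual Parse This code):

```php
<?php
// Convert the parsed og properties into microformats 2 style properties.
function og_to_mf2( array $og ) {
	$map = array(
		'title'       => 'name',
		'url'         => 'url',
		'image'       => 'photo',
		'video'       => 'video',
		'audio'       => 'audio',
		'description' => 'summary',
	);
	// Use og:type as a hint for the top-level type.
	$type = ( isset( $og['type'][0] ) && 'profile' === $og['type'][0] ) ? 'h-card' : 'h-entry';
	$mf2  = array( 'type' => array( $type ), 'properties' => array() );
	foreach ( $map as $og_key => $mf2_key ) {
		if ( ! empty( $og[ $og_key ] ) ) {
			$mf2['properties'][ $mf2_key ] = (array) $og[ $og_key ];
		}
	}
	return $mf2;
}
```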

Of the various types, music and video types are not really represented well in Microformats. So let’s focus on article first.

  • article:published_time – mapped to the dt-published property
  • article:modified_time – mapped to the dt-updated property
  • article:author – mapped to the author property

Many of the types have a tag property, that can have one or more tags…which get mapped to category.

Jamie opted to map the Twitter namespace properties as a secondary factor. I opted not to. The namespace is from their Cards specification, which is really just another OGP namespace. The problem is that they don’t provide an author name or website, only the Twitter handle. The majority of sites I viewed had both the og and the twitter namespaces, and I never got anything from the twitter namespace that wasn’t in the og namespace except Twitter-specific details, which I wasn’t interested in. Facebook was responsible for OGP, and most people want to cover both sites, so they include both.

I did opt to look for the custom namespace for Foursquare venues, which is playfoursquare, for latitude and longitude. I also considered the presence of the namespace to indicate a Foursquare venue, and therefore an h-card.

  • playfoursquare:location:latitude – maps to p-latitude
  • playfoursquare:location:longitude – maps to p-longitude

After the OGP tags, I also looked for some other common meta tag names.

Some academic sources use Dublin Core properties in meta tags:

    • DC.Creator – p-author
    • DC.Title – p-name
    • DC.Date – dt-published
    • DC.Date.modified – dt-updated

Parse.ly, which is part of WordPress VIP, has its own markup.

  • parsely-title – p-name
  • parsely-link – u-url
  • parsely-image-url – u-photo
  • parsely-type – post is h-entry, index would be h-feed
  • parsely-pub-date – dt-published
  • parsely-author – p-author
  • parsely-tags – p-category
  • They also offer a parsely-metadata property for other fields, which is JSON-encoded.

I also convert JSON-LD to microformats, but that’s another story.


IndieAuth for WordPress 4.2.0 Released

Decided to dive into the unknown with the IndieAuth spec. The WordPress plugin now supports the latest in the standard, some of which has been merged, and some of which is pending merge. This will be visible if you visit the spec repo, but has not been deployed to the spec page yet.

The first change is the introduction of the metadata endpoint. This means that instead of a Link header for every endpoint, there is one endpoint that lists all the other endpoints. This means that even if an extension like Ticket Auth (which requires another endpoint) is optional, it won’t require another header.

This is something we have in Micropub, where the media endpoint does not have its own link header (although there is a proposal to change that). But it does mean you have to make two requests (caching aside) instead of one in discovery.

The metadata endpoint also provides some configuration information on what the endpoints support, such as which scopes, which can be useful.

The introspection endpoint, introduced in 4.1.0, as a result no longer shares a URL with the token endpoint. The side effect is needing to implement a proof of concept, as the introspection proposal has yet to be merged. Until it is, it is considered experimental.

The new revocation endpoint allows this feature to be separated from the token endpoint as well. The old method still works for the foreseeable future.

The final endpoint added, the userinfo endpoint, is just a way of getting a refreshed version of the profile info returned when you make the initial request. This is also experimental until merged.

All of this, as well as some minor tweaks and optimizations, works, and is fully backward compatible. At some point in the future, when adoption changes, I will be looking to deprecate the older methods.

All of this is a step along the way of making IndieAuth not so much a separate protocol, but what it is described as…an identity layer on top of OAuth 2.0 (or, increasingly, on top of the proposed OAuth 2.1), with the changes meaning less custom code.

IndieAuth Popup 2 2021

In October, we had a second IndieAuth popup session to finish off what we didn’t in the first one.

Some of the items from the first popup remain unmerged due to some questions, but a lot of them are affected by the now-merged Metadata Endpoint.

The idea is this…instead of having multiple rel values in the header, you have one value, rel="indieauth-metadata". This URL, when retrieved, provides a full JSON configuration for all the IndieAuth endpoints. The old headers will have to stay for a bit for backward compatibility, but eventually they can go away.

This changes the approach for the introspection endpoint, which no longer needs to overload the token endpoint. The same can be said for the revocation endpoint, which currently overloads the token endpoint. So both of these can have their own endpoints. This would, in the future, deprecate the existing ways of doing this (such as action=revoke).

The OAuth2 Server metadata spec, which we adopted with minor modifications, has fields for all of these, so we can simplify the IndieAuth standard and make it more OAuth2 compatible.

Moving closer to OAuth2 means existing OAuth2 clients can be modified to work with IndieAuth with a minimum of issues.

The as yet unsolved problem for me is that the revocation and introspection specs we’re adopting are rather similar…both use POST actions, with the parameter token. However, they both require authentication. It was decided that how this works would not be specified at this time.

So, this makes it a bit hard for my implementation, as I haven’t decided what out of band method I’ll use. I may leave it unauthenticated for now with a warning.

The final addition is the pending proposal for a User Information endpoint, to also be added to metadata. This would have the same return as the profile property during the flow, and allow any token that had the profile or email scope to refresh its profile data without having to go through the flow again.


IndieAuth for WordPress Version 4.1.0

IndieAuth 4.1.0 was released. Some of the biggest changes were fixes resulting from the doubling of unit tests, which allowed several small scenarios to be troubleshot, as well as other under-the-hood fixes.

In addition, it introduces refresh tokens and an introspection endpoint, as discussed at the last IndieAuth popup.

Finally, there is an experimental Ticket Auth endpoint, which is disabled by default and can be enabled by adding define( 'INDIEAUTH_TICKET_ENDPOINT', true ); to your wp-config.php file.

Ticket Auth is a developing spec that I’ve commented on before. This endpoint receives tickets and redeems them…and at the moment, nothing else (which is why it is not enabled by default). That, and the fact that we have not yet iterated on how to use the token once it is received.

IndieAuth Popup – August 2021

At the end of last month, a group got together to discuss some of the outstanding issues in IndieAuth. We had two similar sessions in 2020 and the specification is better for it (see Aaron Parecki’s summary of changes); however, we left many issues on the table due to time considerations.

The spec hasn’t been updated yet, but here are some notes on the new changes. We’re hoping to follow up at another session before the end of the year.

  • Tokens having an expiration is now recommended, but not mandatory. This would mean the token endpoint would return the ‘expires_in’ parameter as part of the access token response, indicating the number of seconds till it expires.
  • In order to support expiring tokens, the access token response may now return a refresh token. The token endpoint will support the grant_type refresh_token in order to utilize these tokens to get a fresh access token. Refresh Tokens are common in OAuth2 and there would be no changes from the existing specifications.
  • Adoption of the OAuth2 Token Introspection specification, with the token endpoint acting as the token introspection endpoint. At this time, that spec requires authentication to use the endpoint, whereas IndieAuth’s existing token verification interaction does not. Whether we should specifically drop the auth requirement is still an open question.

Bathroom Renovation – August 2021

At the end of last month, for a few days, I had my bathroom fixed. It isn’t visible in the pictures, but the metal tub was rusting through around the drain, and there were other issues.

The bathroom had a half-tile wall around it, a soffit over the shower where ductwork ran from the building heating system to the exterior vent, and a few other challenges.

The renovation gutted the bathroom down to the studs and the cement, where I discovered there was none under the drain…just dirt, and the source of one of the problems. There was leaking from the bathroom above, which had to be fixed as well.

The bathroom was original to the building, built in 1976, and while the blue was a nice enough color, it made the tile hard to replace. I had no spare tiles. So, in the replacement, I opted to retile only the shower stall itself in white subway tile (something easy to get for years to come), and just replace the sheetrock elsewhere, so it could be repainted in a few years as needed.

I’m too tall for the bathtub and hadn’t really used it, so I went with a shower pan and doors.

This is a small bathroom, so I also had the floor retiled in black slate, and replaced the toilet (which was last replaced in the 90s) with a dual-flush toilet…which was illegal in New York City till 2010…and a mechanical bidet toilet seat. I had previously added a mechanical bidet attachment. While some people have issues with bidets…I save a lot of toilet paper by using one.

Mechanical bidet seats or attachments work by splitting the cold water intake on your toilet. The fancier models may branch off your hot water, or may have electric features, but that was unnecessary for my purposes.

The biggest change to the bathroom, one that I’d wanted for years, was electrical. For one, I now have an outlet inside the cabinet over my toilet, which allows me to charge my shaver and electric toothbrush inside it.

There was originally a single light fixture over the sink. That is gone, replaced by three ceiling lights. These are disc lights, which are the latest replacement for traditional recessed lighting. A hole is still cut in the ceiling, and a junction box placed inside converts to the appropriate voltage, with the disc light connected and clipped into place. This allowed for one light inside the shower itself, which was always dark, one in the middle of the room, and one over the sink. They are also color-temperature adjustable, if you remove them and adjust a switch, should I ever want that.

Finally, the switch for the exhaust fan was replaced with a timer switch with a built-in humidity sensor, so it can be safely left on till the humidity returns to lower levels. I previously just used a timer switch; this gives it a little more intelligence.

I think the final result turned out well. Nothing I bought was particularly expensive, individually, and I tried to pick things that I thought would hold up. I even got them to run a network cable through the wall between the rooms on either side, for future proofing, before they sealed it up.


A Website Refresh And Dark Mode

I’m often adding features and functionality to my website. A location tweak, a new link, etc. But it’s been a while since I did anything major to the layout.

When I initially heard about dark mode support, I decided to wait until there was more support, and then I just didn’t get around to it. It became a thing for applications to have dark modes, then dark modes that would activate based on a global system preference.

So, now my site, if you set your system to prefer dark mode, will show you a dark version; otherwise, it will show you a light version. I took a lesson from Jeremy Keith, who did this two years ago, and used something called CSS custom properties…another thing I hadn’t used before.
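
The core of the approach is small. A minimal sketch, assuming the theme’s stylesheet handle is twentysixteen-style (the handle and colors here are placeholders): define each color once as a custom property, then override the values when the system prefers dark mode.

```php
<?php
// Attach light/dark palettes to the theme stylesheet via CSS custom properties.
add_action( 'wp_enqueue_scripts', function () {
	$css = '
		:root { --background: #ffffff; --text: #1a1a1a; --link: #007acc; }
		@media (prefers-color-scheme: dark) {
			:root { --background: #1a1a1a; --text: #eeeeee; --link: #8ab4f8; }
		}
		body { background-color: var(--background); color: var(--text); }
		a { color: var(--link); }
	';
	wp_add_inline_style( 'twentysixteen-style', $css );
} );
```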

My WordPress theme is based on the original Twenty Sixteen WordPress.org theme. I ported back select improvements from _s, the starter theme it was based on, as well as from subsequent WordPress.org themes, from Twenty Seventeen through the present.

There are a lot of other little tweaks I had to make, both in the theme and in the plugins I develop for WordPress, to support this: style improvements, filters to add for additional functionality, etc.

It is still a work in progress, and I have other ideas and plans, but it is live. See if you can find all the other little tweaks.