
Today's internet is built around silos: websites want to be the place you have to come to see updates. For example, you have to go to Twitter to check out someone's post, and while you are there you may stick around for a while. RSS is different. RSS is a format in which a website publishes its posts, each with a title, a description, a link to the original post, and maybe some other details. You use a program that checks this list regularly, and it shows you when a new item appears, much like how your email inbox collects messages from many sources.

That's what makes RSS different: rather than checking each site periodically on your own, your reader checks for you. You don't need to visit Twitter regularly, only to get stuck scrolling recommended items while you're there. RSS puts all the new items together for you to browse at your own pace.

A lot of sites still support RSS, and it's a great way to browse content. There are many online feed readers, such as Inoreader, Feedly, and Miniflux. Personally, I host my own instance of tt-rss. These services let you sync feeds across devices. There are also local-only programs, like Thunderbird and Flym for Android, that are generally completely free but don't sync as easily. Email me if you're interested in using tt-rss, and I can give away a few accounts on my instance.

Once you decide on a reader, you can paste a site's URL into it to subscribe. You can subscribe via an RSS link, which may be included at the bottom of a page, or usually just with the normal URL (the reader will look for an RSS link automatically). Additionally, I have created this Firefox extension, which will pop up with the RSS links on a page: https://addons.mozilla.org/en-US/firefox/addon/rss-bridge-helper/.
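That automatic lookup works because most pages advertise their feed in the HTML head. As a rough sketch of what a reader (or my extension) does under the hood, assuming the requests and beautifulsoup4 packages and using a made-up URL:

```python
# Sketch of RSS autodiscovery: fetch a page and look for advertised feeds.
# Assumes `requests` and `beautifulsoup4` are installed; the URL is just an example.
import requests
from bs4 import BeautifulSoup

def find_feeds(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    feeds = []
    # Feeds are usually advertised as <link type="application/rss+xml" ...> in the head.
    for link in soup.find_all("link"):
        mime = link.get("type", "")
        if "rss" in mime or "atom" in mime:
            feeds.append(link.get("href"))
    return feeds

print(find_feeds("https://example.com/blog"))
```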

My extension does more than just show normal links. If you look up a public RSS-Bridge URL and set it in the extension, you can also use it for Twitter, Instagram, and YouTube feeds. Most of these sites don't provide RSS natively, and RSS-Bridge is a web application that generates feeds for other sites so they work in your feed reader. YouTube does have some native RSS support, but they do not advertise it.

RSS is a way to decentralize the web again. It can help us move beyond scrolling a single website controlled by ad-tech businesses. It works for your interests, rather than forcing you to work within the confines of a particular website.

Sun Nov 29 100days 👍 (0)

I've updated my git server and written an info page about it. Check out the info page link for a post about it; I'll leave the post there rather than copying it all here.

I've been pretty busy with school projects this semester. I'm taking more courses than normal in an attempt to finish an MS degree this year, and my TA position is busier this semester as well. Because of this, I haven't been writing blog posts much, but I expect that once things wind down I will get back into it.

Sun Nov 15 100days 👍 (0)

For my personal wiki I use wiki.js. Previously, I used cherrytree, which is a hierarchical note taker. However, I wanted to use it on mobile, and there is no way to do so with cherrytree files. So I looked into personal wikis. I wanted something that would let me make pages private, as I use it as a general purpose journal and it does contain private information. wiki.js seemed like a good solution, and it had good mobile support, or so I thought.

wiki.js works great on desktop, and you can read pages easily on mobile, but the editor on mobile is horrendous. It does not use native editor elements, so there are many race conditions when typing with a mobile keyboard. It is laggy, characters do not show up correctly, and I have to type very slowly for things to actually work.

My solution to this problem was creating wikijscmd, a command line interface for wiki.js. wiki.js has a GraphQL API with no documentation, but after spending some time figuring out how to use it, I was able to send these queries from Python. I wrote a simple wrapper to create, edit, and view single pages, and also to get the path tree. It works very well, and I am actually using it right now to draft this post. It opens your VISUAL editor if set (like git commit does) for editing, and I do prefer to write things in vim. It is still nice to have the wiki on the web, where the formatting is nice and browsing is easy, but I like having a command line version too. In the future, I'm planning on writing some email scripts so that I can send new pages via email, which would make mobile editing very straightforward.
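If you're curious about talking to wiki.js this way, here is a minimal sketch of the kind of call wikijscmd makes. It assumes a wiki.js 2.x instance with its /graphql endpoint and an API token; the exact query fields are my best guess from poking at the schema, so treat them as assumptions rather than documented API:

```python
# Minimal sketch of querying the wiki.js GraphQL API from Python.
# Assumptions: a wiki.js 2.x instance at WIKI_URL, an API token with read access,
# and a pages.single query shaped roughly like this (verify against your schema).
import requests

WIKI_URL = "https://wiki.example.com/graphql"   # placeholder instance
API_TOKEN = "your-api-token-here"

QUERY = """
query ($id: Int!) {
  pages {
    single(id: $id) {
      path
      title
      content
    }
  }
}
"""

def get_page(page_id):
    resp = requests.post(
        WIKI_URL,
        json={"query": QUERY, "variables": {"id": page_id}},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["data"]["pages"]["single"]

page = get_page(1)
print(page["title"])
```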

Sat Oct 17 100days 👍 (1)

There is this idea of "Daily Devotions," which typically refers to a Christian practice, but I want to extend it into other areas of life. In my last post, I talked about a script I wrote that sends me a daily update email with news from 100 years ago via the Library of Congress newspaper archives. I wanted to add more to the email, and this week I read Austin Kleon's blog post on the same theme, which inspired me. I really enjoy seeing something that was meant for today, in a different time, so I extended my daily email with a few more things. Now it includes the daily Wikiquote, the Wikipedia page for today, and a Calvin and Hobbes comic. Each of these gives content meant for a date, but they each come from a different place and are different kinds of devotions.

The Wikiquote is chosen by a user of the website, and while interesting, quotations do not really belong to a date. A good quotation is timeless, and so doesn't belong tied down. It is a devotion to some philosophical notion, but usually a shallow one.

The Wikipedia page for today offers a broad historical perspective on a date. Everything "important" that has ever happened on a given day is recorded there: events, births, deaths. This gives some factual knowledge to the reader, but again I think it is a bit shallow. It is usually not insightful to know precisely what day an event happened on.

Comic strips are more interesting to me, since they are mundane. A strip is an artistic expression, and over time the strips piece together to form a more meaningful thought. They have stories and plots, and they serve as seasonal reminders. Calvin and Hobbes in particular is very nostalgic to me, and even more so are the wintertime series, showing a joyful Calvin enjoying a dreary time of the year. Bill Watterson created something that is entertaining to kids, but with a lot of meaning for adults as well. While comic strips may be pulp, I can understand one in a devotional sense.

Daily news from 1920 is also mundane, but I would not call it as artful as comics. Reading contemporary accounts of history helps me stay grounded. In my email, I specifically get "The Belding Banner," which is the closest newspaper to my hometown that the Library of Congress had. All the news that affected the people who lived in the same place as me has come and passed, with little of it being remembered. It is important to still be able to look through it. It reminds me that so much of what I might be worried about will pass too.

Mon Sep 28 100days 👍 (0)

Email is great. It is a time-tested standard, reliable and federated. Today I spent some time cleaning up a couple of scripts I have regarding email. Ever since I set up my own server with mailinabox.email/, I have been scripting some IMAP and SMTP things. It is much easier to do this with a custom server than with a large email provider like Gmail, since you don't have their proprietary system on top of everything. The scripts reside in this repository.

email_helper.py

This file handles all the shared code regarding IMAP and SMTP, like sending mail and parsing the inbox. I did not have this before today, and foolishly was just copying the send mail method into every script. In refactoring this, I accidentally sent out an email about an old post to all the subscribers on my website from the email sign-up (sorry about that). I'm not quite sure how old that post was, as I don't show years in posts for some reason, which is something I need to fix.
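I won't paste the helper itself here, but the shared piece is roughly what you'd expect from Python's standard library: smtplib for sending and imaplib for reading. A hedged sketch of a send helper (the host, port, and credentials are placeholders, not my actual setup):

```python
# Rough sketch of a shared "send mail" helper using only the standard library.
# The host, port, and credentials are placeholders; real values come from a config file.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "box.example.com"    # placeholder mail server hostname
SMTP_PORT = 587                  # submission port; a typical Mail-in-a-Box setup uses STARTTLS here
USER = "me@example.com"
PASSWORD = "app-password-here"

def send_mail(to_addr, subject, body):
    msg = EmailMessage()
    msg["From"] = USER
    msg["To"] = to_addr
    msg["Subject"] = subject
    msg.set_content(body)
    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as smtp:
        smtp.starttls()
        smtp.login(USER, PASSWORD)
        smtp.send_message(msg)

send_mail(USER, "test", "Hello from email_helper")
```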

config.py

Manages the config file for everything. If no config file exists, it will prompt you to create one with a creation tool.

kitchen-update-email.py

This script is fairly specific: it checks if there is a new post on my website and, if so, sends out the email. Most of this code is DB logic, since that's where the information about when the last email was sent is kept. I run it in crontab, so emails are sent when my server hits midnight.

reminder.py

This is where things begin to get a bit interesting. I set up this script to send an email with a subject starting with "REMINDER: ", with the rest of the subject and body coming from the command line arguments. This makes it really easy to set up a crontab line with the script for things like "pay rent near the end of the month" or "backup computer on the 5th". When the email arrives in my inbox, it is a reminder to do that thing.
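For example, the crontab entries this enables might look something like the following; the paths, times, and argument order are illustrative, not copied from my actual crontab:

```
# Illustrative crontab entries (paths and schedules are made up for the example).
# At 09:00 on the 28th of every month: rent reminder.
0 9 28 * * /usr/bin/python3 /home/me/scripts/reminder.py "pay rent" "Rent is due soon"
# At 09:00 on the 5th of every month: backup reminder.
0 9 5 * * /usr/bin/python3 /home/me/scripts/reminder.py "backup computer" "Run the monthly backup"
```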

daily-update.py

This script does the most, and I run it every day. First, it will check my inbox for any emails that contain "REMINDER" in the subject and include their subjects in a list, just so the unread emails don't get buried in my inbox. Next, it will go to the Library of Congress newspaper archives to find a newspaper from 100 years ago and if there is one it will appear also in the email. I find old news to be fascinating, and in the future would be interested in including some more "this day in history" features. Lastly, this script will also include a detailed weather forecast table. I have more ideas of things to include, and I am excited to extend this daily email with even more things.
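The "100 years ago" lookup is the fun part. A sketch of the idea is below; it assumes the Chronicling America site's JSON issue URLs, and the LCCN identifier is a placeholder (not necessarily The Belding Banner's), so check the actual endpoint before relying on it:

```python
# Sketch: check whether a given paper has an issue exactly 100 years ago today.
# Assumptions: Chronicling America serves issue metadata at
# /lccn/<LCCN>/<YYYY-MM-DD>/ed-1.json; the LCCN below is a placeholder.
from datetime import date
import requests

LCCN = "sn00000000"  # placeholder identifier for the newspaper

def hundred_years_ago(today=None):
    today = today or date.today()
    # Ignores the Feb 29 edge case for brevity.
    return today.replace(year=today.year - 100)

def find_issue():
    day = hundred_years_ago().isoformat()
    url = f"https://chroniclingamerica.loc.gov/lccn/{LCCN}/{day}/ed-1.json"
    resp = requests.get(url, timeout=10)
    if resp.status_code == 200:
        return url  # an issue exists for that date; link it in the email
    return None

print(find_issue())
```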

Sun Sep 20 100days dev 👍 (0)

I watch more YouTube videos than I care to admit. YouTube's site is addicting and distracting; it is so easy for me to spend hours watching videos. I don't want to quit YouTube entirely, but over the last few years I've been trying to find ways to still watch some YouTube videos, mainly from my subscriptions, without getting sucked into the recommendation vortex. On top of this, there are the privacy implications and the act of building up Google's internet empire, but that is for another post.

youtube-dl

If you are techy, you can use the command line tool youtube-dl to download all videos you want to watch and then open them in your media player. This is a fairly annoying workflow in my opinion, and it still requires you to go to unfettered youtube.com in order to find video URLs. I personally find this workflow to be too abrasive to use.

youtube-xspf

Many media players (VLC, MPV) can play from a YouTube URL directly. They can also do this from a playlist (xspf) file. So I created youtube-xspf, which takes the XML file of your subscriptions exported from the subscriptions manager and generates a playlist file of videos for you to watch, which you can open in your media player of choice. This actually is a pretty nice way of browsing YouTube, but I still find it a bit clunky. Sometimes the VLC stream doesn't open either, and then you can't watch a video.
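To give a feel for the output, an XSPF playlist is just XML with one track per video. A stripped-down sketch of generating one (the video URL is a placeholder; youtube-xspf itself builds the list from your subscriptions export):

```python
# Sketch: write a minimal XSPF playlist that VLC/MPV can open.
# The video URL is a placeholder; the real tool derives entries from your subscriptions.
import xml.etree.ElementTree as ET

videos = [
    ("Example video", "https://www.youtube.com/watch?v=XXXXXXXXXXX"),
]

playlist = ET.Element("playlist", version="1", xmlns="http://xspf.org/ns/0/")
tracklist = ET.SubElement(playlist, "trackList")
for title, url in videos:
    track = ET.SubElement(tracklist, "track")
    ET.SubElement(track, "title").text = title
    ET.SubElement(track, "location").text = url

# Open watch.xspf in VLC or MPV to stream each entry.
ET.ElementTree(playlist).write("watch.xspf", encoding="utf-8", xml_declaration=True)
```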

NewPipe

The Android app NewPipe (available on F-Droid) has a pretty similar workflow to the above method, but it uses its own format for feeds rather than xspf. It is a great app, but since it's mobile only, I don't see it as a complete solution. NewPipe does a lot more than the xspf approach, like showing comments, descriptions, and some recommendations.

Invidious

This is an alternate frontend to YouTube whose main instance recently shut down (though there are still more listed here: https://invidio.us/). It allowed for a YouTube experience on the web without the bloated website. I had issues with running my own instance and managing subscriptions, so I switched to the next solution for my needs.

RSS feeds

YouTube provides RSS feeds for each channel automatically, so one option is to use these in your feed reader of choice. This is my preferred way of managing YouTube currently. I can use my feed reader to sort these items, and I save the entries I want to watch later. I can watch either in my media player or on youtube.com. I've been using this with extra uBlock filters to block out most parts of the website that aren't the video itself, to keep me focused. This isn't as strict about blocking out YouTube as using a media player, but it does help me change my habits. Instead of going to YouTube's home page to check for new videos, I just go to my feed reader. It helps me stay less distracted.
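The channel feed URLs follow a fixed pattern, so you can also script against them. A small sketch with the feedparser package (the channel ID is a placeholder):

```python
# Sketch: read a YouTube channel's RSS feed directly.
# Channel feeds live at /feeds/videos.xml?channel_id=<ID>; the ID below is a placeholder.
import feedparser

CHANNEL_ID = "UCxxxxxxxxxxxxxxxxxxxxxx"  # placeholder channel ID
feed_url = f"https://www.youtube.com/feeds/videos.xml?channel_id={CHANNEL_ID}"

feed = feedparser.parse(feed_url)
for entry in feed.entries[:5]:
    # Each entry has the video title and a normal youtube.com link you can
    # hand to VLC/MPV or open in the browser.
    print(entry.title, entry.link)
```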

Peertube

Alternative video sites seem really cool, but there just isn't enough content at the moment. Perhaps there is, but another problem with these sites is discoverability. It is so easy to find niche interests on YouTube; they are practically shoved down your throat with repeated recommendations. I do follow a few PeerTube RSS feeds, but I have not found myself able to rely on these alone for entertainment.

Sat Sep 12 100days 👍 (10)

A while back I wrote local-podcast-generator as an Android app. It is a fork of a simple HTTP server that generates an RSS feed for the files in a given directory. There are a few more things I need to work out before it is fully released.

What is it?

You give it a directory on your phone, and it creates a podcast feed based on the files in that directory. You can take that feed and add it to your favorite podcast player. It does not recursively explore down the file tree.
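The app itself is an Android project, but the core idea is simple enough to sketch in a few lines of Python: list the audio files in a directory and emit an RSS feed whose items point at them (the base URL and directory path here are placeholders, not how the app is actually wired up):

```python
# Sketch of the core idea: turn a directory of audio files into a podcast RSS feed.
# The base URL and directory are placeholders; the app serves the files itself.
from pathlib import Path
from xml.sax.saxutils import escape

BASE_URL = "http://127.0.0.1:8080"       # placeholder address the files are served from
AUDIO_DIR = Path("/path/to/audiobook")   # placeholder directory

items = []
for f in sorted(AUDIO_DIR.glob("*.mp3")):  # no recursion, matching the app's behavior
    url = f"{BASE_URL}/{f.name}"
    items.append(
        f"<item><title>{escape(f.stem)}</title>"
        f'<enclosure url="{escape(url)}" length="{f.stat().st_size}" type="audio/mpeg"/>'
        f"</item>"
    )

feed = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<rss version="2.0"><channel>'
    "<title>Local audio</title>"
    f"{''.join(items)}"
    "</channel></rss>"
)
print(feed)
```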

Why?

Let's say you have a series of audio lectures or audio books downloaded to your phone. You've tried to listen to them in a media player, like VLC, but this isn't optimal. Media players don't keep track of which episodes you have listened to, and they can lose your place. With a podcast player, if you take time off from listening, you can come back and pick up right where you left off. Podcast apps are designed for this type of long-form, episodic audio content.

I've tried to listen to LibriVox audio files in VLC, but it really is inferior to a podcast app. That's why LibriVox offers RSS feeds for its audio books. But some of my audio content comes from CDs, and no feeds exist for it. Hosting the files myself as a remote RSS feed seems like overkill, and would require downloading them over the network. Running the podcast generator app keeps everything local.

Download

Get the apk release from here. You should hopefully be able to install that (I'm still figuring out how to build and distribute apk releases, so let me know if there are any issues).

Wed Sep 02 100days dev 👍 (0)

This summer, I taught my first class: an intro to programming (procedural Java) course at my university, for students with no background in the area and open to students of any major. I was a TA for the two semesters prior, and during my undergrad I was a general tutor for the computer science department. However, this summer was my first experience leading a class, and the first time I was the lecturer. The course was entirely online, which brought more challenges. I wanted to share some of my experiences from this time, and some of my thoughts on teaching in general.

Teaching Java

Java is a great first language. This may be my bias, as it was also my first language, and so it holds a special place in my heart. Lately, a lot of people suggest Python or JavaScript as a first language because they are "easy" to learn. I disagree entirely. Python and JavaScript are caught up in fast-changing usage: the definition of "pythonic" changes, and so does ECMAScript. If you look up a programming question for one of these languages on Stack Overflow, you will find so many answers it is overwhelming. Java has less syntactic sugar, so the syntax is easier to learn. Additionally, in both of those languages, looking up a question will likely just give you the name of a recommended library function. While learning how to use libraries is important, it is better to start with a few from the language's standard library. Programming beginners are already overwhelmed, and adding in third-party libraries extends a programming language too much right away. Lastly, Java is statically typed. Static typing is great when you are learning programming, since the compiler will offer corrections.

Teaching Java first does have some difficulties too. The course got into the basics of source code vs. byte code, which makes some sense, but Java throws a virtual machine into the mix as well. The class also focused on procedural programming, yet Java requires you to have a class, which is confusing. Some Java editors automatically add packages, which causes problems with the unit test grading.

Creating Lectures

Creating lectures is difficult. I required the reading for a chapter to be done before the lecture on that chapter. That way, most learning could be done from the reading, and the practice examples in the textbook would reinforce it. The lecture would then be for clarifying things, going deeper into concepts, and showing examples that would help with the projects.

My philosophy was to give students a lot of practice seeing code problems. Much of lecture was live coding in front of students. I think the students who did show up live and could ask questions about these examples benefited the most. It was great when students asked "what if you changed it to..." and we could find out together. As a programmer, my instinct is to run code when I have these questions, but this is something that has to be taught. So any question I was asked, I tried to answer by writing code.

I also focused in lecture on the patterns behind programming. When writing a new method that iterates through an array, for example, there is often a similar pattern: initialize a variable, loop through the array, return the variable. Similarly, there are simple patterns for things like using Scanner: import it, create the scanner, use the scanner. If students can remember these patterns, they can mix them together to write all the required programs. As they learn more, they begin to understand what these patterns really are, once they learn things like what calling an instance method actually means.

Managing people

I had a few TAs I worked with who were my peers in prior semesters. I do not think I managed them very well. Primarily, their job was to hold office hours to help students with homework, and this went fine, though I should have asked for more help with creating assignments and review material. I got those jobs done fine myself, but it may have been beneficial to have examples created from the minds of multiple people.

It was also my first time managing students. As a TA, the only conflict I had was over regrade requests. As an instructor, you are exposed to many more situations. Students have emergencies, forget things, and communicate too late. Usually, I was very relaxed with policies. When something came up, I would think about what was best for the student's education. I find it easy to be a stickler about rules, but very soon into the course I realized this is not helpful for learning. If a student asks for an extension on an assignment they missed, it is better they do it late than not at all, especially in programming, where so much learning comes from the actual practice of programming.

Sat Aug 22 100days 👍 (0)

If you are like me, you download dumps of strange data archives. I have old ebooks, internet zines, saved web pages, and more that I have saved to my computer for later use. Rather than bookmark an article or video, I'll manually save it (see my last post on how things do not last very long on the internet). There is a problem with this: it is not very nice to browse huge folders; usually you will check just the first few things. Another problem: even though everything I've downloaded provides me with more content than I could ever have time to enjoy, I spend almost no time at all enjoying it. Rather, I'll just open up some social network feed (Reddit, Twitter, Hacker News) that does the hard work of curating for me (by telling me what to look at now, with more to show up later).

So what is a solution? A local web server that randomly suggests things for me to check out from my hard drive, and displays them in a timeline format mimicking these popular websites. Instead of opening a link to a website when you click on an item, it runs an OS command to open it in your default application (so audio files open in VLC for me). It works great for me: rather than browsing a social media site, I can check out the content that I decided was interesting enough to save for later. Right now, it needs a few more features, namely authentication (so someone can't send a bunch of requests that open a ton of documents on your computer) and filtering (so huge dumps can be ignored). The random algorithm could also be improved by comparing access times to creation times to find obscure files you have never even opened. Overall, it is an interesting project that helps me move away from the algorithmic, hyper-fast content that I find toxic to my attention span and mental well-being. While it is a simple program at the moment, it has a lot of potential for future features.
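The heart of it is tiny. Here is a sketch of the "pick something and open it" part, assuming a Linux desktop where xdg-open hands the file to the default application (the archive path is a placeholder):

```python
# Sketch: pick a random saved file and open it with the default application.
# Assumes a Linux desktop with xdg-open; the archive path is a placeholder.
import random
import subprocess
from pathlib import Path

ARCHIVE = Path("/home/me/archive")   # placeholder directory of saved stuff

def suggest_and_open():
    files = [p for p in ARCHIVE.rglob("*") if p.is_file()]
    if not files:
        return None
    choice = random.choice(files)
    # xdg-open dispatches to the default app (VLC for audio, a PDF viewer, etc.).
    subprocess.run(["xdg-open", str(choice)], check=False)
    return choice

print(suggest_and_open())
```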

Mon Aug 10 100days 👍 (0)

This week I created my own bot for the social network Mastodon. The idea came about when I was using Hacker News' "past" feature, which shows an archive of older posts sorted by date. So, three times each day, my bot posts one of the top posts from exactly one decade in the past. Here is a link so you can check it out: the bot

It is interesting and grounding to see what was going on one decade ago. Will the top posts be about some exciting new tech that has since been completely forgotten? Or perhaps some contemporary blog post that still remains insightful? The comments on the articles remain a time capsule of a different internet age.

This experiment also shows how temporary the internet is. So many of the links point to content that no longer works. Even if the website is still operating, the images or video in the articles are often missing. If you want more proof of this, check out your YouTube favorites/liked playlists. Odds are they're riddled with [Deleted Video]s, with no information about what was ever in their place. It's a reminder to download anything that you truly wish to keep around.

Onto the code

The Mastodon.py library provides a Python wrapper for the Mastodon API. Its documentation gives an OK explanation of setting things up. To clarify: there is one piece of code it gives to generate a clientcred.secret file. After generating this, you generate a usercred.secret file. You then use this to log in when you want to use your bot. While this is simple after the fact, it does mean you have to run separate code to set up the secret files. Here is a link to the source file for my bot, but keep in mind it requires you to have generated the secret files.
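In case the two-step setup is unclear, here is roughly what it looks like with Mastodon.py. The instance URL, email, and password are placeholders; you run the registration and login steps once, then reuse the saved files (password login worked for me at the time, though newer instances may push you toward OAuth tokens):

```python
# One-time setup for a Mastodon.py bot: register the app, then log in.
# The instance URL, email, and password are placeholders.
from mastodon import Mastodon

# Step 1: register the application; writes clientcred.secret.
Mastodon.create_app(
    "hn-decade-bot",
    api_base_url="https://mastodon.example",
    to_file="clientcred.secret",
)

# Step 2: log in with the client credentials; writes usercred.secret.
mastodon = Mastodon(client_id="clientcred.secret", api_base_url="https://mastodon.example")
mastodon.log_in("bot-owner@example.com", "password-here", to_file="usercred.secret")

# From then on, the bot just loads the user credentials and toots.
mastodon = Mastodon(access_token="usercred.secret", api_base_url="https://mastodon.example")
mastodon.toot("Hello from the bot!")
```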

After the setup, you should be ready to use anything in the docs to interact with Mastodon as you please. Sending a string as a toot is about as simple as it can be. I generate the string by parsing the HTML from Hacker News with Beautiful Soup, finding the link to the story and the comments. I could not find a way to use the HN API to view past posts, so I am just requesting the web page. Since I am only doing this three times a day, it shouldn't really be causing any stress on their system. I set it up so the bot takes an argument from 0 to 2, for which post it should use (the first through third). I set up cron to run three separate times, each calling the script with a different argument. This was simpler than parsing which hour of the day went with which post.
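The scraping part is just Beautiful Soup over the /front page. A rough sketch follows; the CSS selectors are from memory of HN's markup, which has changed over the years, so treat them as assumptions rather than the bot's actual code:

```python
# Sketch: grab the Nth top story from HN's "past" page for the date a decade ago.
# The selectors ('tr.athing', '.titleline a', 'a.storylink') are assumptions;
# HN's markup has changed over time, so adjust as needed.
import sys
from datetime import date
import requests
from bs4 import BeautifulSoup

def top_story(n):
    day = date.today().replace(year=date.today().year - 10).isoformat()
    html = requests.get(f"https://news.ycombinator.com/front?day={day}", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    row = soup.select("tr.athing")[n]
    link = row.select_one(".titleline a") or row.select_one("a.storylink")
    comments = f"https://news.ycombinator.com/item?id={row.get('id')}"
    return f"{link.text} {link['href']} (comments: {comments})"

print(top_story(int(sys.argv[1])))  # cron passes 0, 1, or 2
```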

The Good Old Days

Writing this bot was very easy for me, and it was nostalgic. In high school I ran multiple Twitter bots that would send out all sorts of things. It has been over 5 years since I last touched social media bots (though I have been working with Telegram bots), and my programming skill has improved hundreds of times over. Back then I struggled with understanding the OO paradigm, and reading documentation was hard and dense. I am able to look back on my old code with some fondness for how hacked together everything seems, piecing together random bits of various example tutorials. Luckily, I have them all saved, and I know they won't just drop off the internet ;)

Sat Aug 01 100days 👍 (0)

I host a variety of server applications. Some of these are public, like my website, but many are intended just for myself. Self-hosting services removes the need to use a lot of proprietary websites, and it keeps my data private. Many of my needs started out as local offline applications, and I migrated to online services that let me use them from anywhere. Setting up servers requires a bit of technical knowledge, but this knowledge is best learned by attempting (and failing) in low-stakes environments. The first services I ever ran were Twitter bots that ran each day on my Raspberry Pi Model B. These scripts were only a few files total, and if I ever messed up, I could simply reinstall the operating system.

VPS vs local server

I started out running things off a local server, either a Raspberry Pi or an old laptop. This appealed to me most, since it was the cheapest. However, now I run most things off a VPS (virtual private server) that I rent from DigitalOcean. Running things locally can be cheaper if you have old hardware lying around, but your home network does not usually have a static IP, so accessing your services from another network can be difficult, not to mention the security risk of opening ports. While a VPS costs me a monthly fee, I take comfort in knowing that if my services are attacked, my home computers will be safe. Not having to run my own hardware is nice too; I do not worry about the drive failing on my VPS. In practice, I run a mix of both: anything I want to use from any network is on my VPS, and anything I just need at home runs locally.

What I host

VPS Services

Local Services

Tue Jul 28 100days 👍 (0)

I want to write more. I keep a diary, where I mostly write short things about each day. But it is hard to be intrinsically motivated to write things that only you will see. This isn't a problem on the internet, where anyone can post what they want. Writing longer blog posts seemed hard, so I joined fosstodon.org (@markp), a decentralized "Twitter"-like social network (I want to avoid Twitter due to its algorithmic curation). There, I could write shorter things (which didn't seem like such a big commitment) and get some social engagement, which has been hard to come by lately. On Fosstodon, I saw a lot of people posting about 100 Days to Offload, which encourages everyone to just write on their blog, no matter how well put together the posts are. Specifically, the challenge is to write 100 blog posts in a year.

The premise is that writing is a habit that requires effort and time to learn. So I've decided to use my blog for this challenge and to start posting much more often than I have been. My goal is to hit 50 posts in a year, which is only about one a week. Not only is this very achievable, it gives plenty of time to produce content of some quality. 50 posts leaves no room for excuses, and if things go well enough, I may still have time to hit 100.

There is a lot I want to get out of this challenge. Firstly, I want my blog to have some posts that I am proud of and that are helpful. Right now, it is mostly minor showcases of my projects, but none of them are really that interesting or insightful. Second, I want to get better at editing my writing. It is something I have always avoided doing properly (if you go back to my early posts, there are some obvious grammatical errors). Thirdly, I want to practice being creative and doing things other than just consuming. It is much harder to be attentive and focused on creative endeavors, especially after years of putting them aside.

If you are like me, and have always wanted to write and create on the internet, why not start now? If what you make ends up awful, oh well, it was just part of that weird challenge to pump out content. The potential for something great to arise, even if unlikely, is worth the effort.

Wed Jul 22 100days 👍 (0)

I've been interested in open source software for a long time. When I see some proprietary tool that I think looks useful, I enjoy writing my own version of it. These clones are usually fairly limited, but since I am the sole user of most of these projects, that is fine. They are good enough for what I need and want.

Budgeting:

I have a simple website that stores a table of my transactions. I wrote some basic queries to also show tables by week, month, and year. Prior to this, I used an app I wrote where I entered the transaction details, which were then sent to me as an email. A Google script scraped my Gmail inbox for these emails and parsed them into a Google spreadsheet. My app was easier to use than entering data into a mobile spreadsheet. While it sounds complicated, the underlying programs were fairly simple (~60 lines of Java code for the app, and 25 lines for the Google script). I moved away from this system both to get away from Google and because spreadsheets were not the easiest to work with (I'd rather do the complicated stuff in SQL).
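The SQL side really is basic. As a sketch of the sort of per-month rollup I mean (the schema here is illustrative, not my actual table):

```python
# Sketch: monthly spending totals from a simple transactions table.
# The schema (date TEXT, amount REAL, description TEXT) is illustrative.
import sqlite3

conn = sqlite3.connect("budget.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS transactions (date TEXT, amount REAL, description TEXT)"
)

monthly = conn.execute(
    """
    SELECT strftime('%Y-%m', date) AS month, SUM(amount) AS total
    FROM transactions
    GROUP BY month
    ORDER BY month
    """
).fetchall()

for month, total in monthly:
    print(month, total)
```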

Blog:

I wrote this website with my own backend for a few reasons. I didn't want to be locked into any system like WordPress. It also gave me a chance to learn more about creating web servers and sites. I wanted something with more power than just a static HTML site, and something that would be fun. My original idea was to create a website where I only posted pictures of bread. I expanded this into much more, where I now have a blog, games, and a few other misc. projects.

Life Database:

There were a lot of things I wanted to track my progress with, so I wrote a website to keep track of them: mainly movies I've seen, books I've read, my music collection, and bookmarks. All of these have popular websites that can be used, but again I wanted to keep my data local. I have been finding myself wanting something even simpler, and in 2020 I have just been managing these things in cherrytree (a hierarchical note-taking application).

Games:

I do like to recreate bits of games (simple bits at least). The one that fits best into this blog is Quiz Bunny, which is similar to some ad-ridden app game with the same premise. It took a day to recreate, and it was quite a new kind of program for me; I had not really made anything "multiplayer" like it before. I also created the basics of correspondence chess on my website, with a graphical board that you can interact with; however, it is still a very alpha project, since the rules of chess are trickier than they seem.

Home Weather Station:

Here we get to the real reason for this blog post. I love weather data, and I wanted a home weather station. I wanted to know my home's temperature, pressure, and humidity, and the trend of these things. There are many existing products that do this, starting at about $60 for the most basic items. That's when I considered just building my own station instead. I purchased a Raspberry Pi Zero W, an Adafruit BME280 sensor, and the needed peripherals (SD card, PSU) for just $40. I used the Adafruit module to connect to the sensor and logged its data to a file. Then, I wrote a web server in Python that plots this data, so I can view it whenever I want from my local network. It works great, and I have much more control over this data and how I want to use it than I could with any consumer device.
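For anyone wanting to try the same, the sensor-reading loop is only a few lines. Here is a sketch assuming Adafruit's CircuitPython BME280 library over I2C; the import path and attribute names vary a bit between library versions, so check against the release you install:

```python
# Sketch: log BME280 readings to a CSV file on a Raspberry Pi.
# Assumes the adafruit-circuitpython-bme280 library; module/attribute names
# differ slightly between versions, so verify against your installed release.
import time
import board
import adafruit_bme280

i2c = board.I2C()
sensor = adafruit_bme280.Adafruit_BME280_I2C(i2c)

with open("weather.csv", "a") as log:
    while True:
        # temperature in degrees C, humidity in %RH, pressure in hPa
        line = f"{time.time()},{sensor.temperature},{sensor.humidity},{sensor.pressure}\n"
        log.write(line)
        log.flush()
        time.sleep(60)  # one reading per minute
```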

This was my first experience doing any sort of electronics work where I had to create my own circuit. I had to purchase a soldering iron, some wire spools, and other supplies needed for this hobby. These did bring the price up to more than just buying a consumer device, but now I am past the hurdle of not having supplies. There are other electronics projects I want to tinker with in the future, with the next being a plant monitoring station.

I love software and computing, but I also want to branch out. The reason I got into programming in high school was that it was free and infinite. Now that I have some disposable income, I can look for hobbies in the "cheap" category, and the hardware community has a lot of help for beginners (knowing programming well helps even more). Electronics is also easy to do in my apartment, in contrast to other things like woodworking.

Content Note

I'm planning on posting about weekly, inspired by #100DaysToOffload (which is 100 posts in a year). I am aiming for 50 blog posts a year at the moment. My next post will probably be about this.

Sun Jul 19 100days 👍 (2)