Installing and using Xdebug with Homebrew, Valet, and VS Code in 2021

As I was working on a client project, I needed to do some more in-depth debugging in a PHP application. Laravel has a couple of great utilities like dd() and dump() that make the output of something like var_dump() much more readable and actionable, but sometimes that’s not enough. There’s another tool from the Laravel ecosystem called Ray that’s also worth checking out. It’s kinda like dump() on steroids and well worth the money in my opinion. However, the king of all debugging tools for PHP is still hands down Xdebug.

Getting Started

Before diving in, it’s important that we’re all starting from the same place in terms of tooling.

  • You’ve installed and configured Laravel Valet and it is serving sites from somewhere on your local machine.
  • You’re using Homebrew in conjunction with Valet to manage PHP versions (the default for Valet).
  • You have installed the excellent PHP Monitor utility application.
  • You have installed the Xdebug Helper Chrome extension.
  • You have installed VS Code.
  • You have installed the PHP Debug extension for VS Code.

Installing Xdebug

Xdebug version 3 was released in late 2020 and configuring it has become much simpler assuming you are using the defaults. The documentation on the Xdebug website will work for a lot of folks, but if you use multiple PHP versions via Homebrew, sometimes the tools can get confused when installing.

First, we need to find the real path of your PHP installation. On the command line run this command:

brew info php # default php version, 8.0 at the time of writing
# or 
brew info php@7.4 # or php@7.3 etc

This will spit out a bunch of stuff you can ignore for the most part, but we want this line:

/usr/local/Cellar/php/8.0.3

Your installation path might look different depending on your PHP version or Homebrew configuration. The important thing is that we want to use the pecl binary inside this folder to install extensions. By running pecl via this full path, it will know the correct location to find and modify configuration files.

Next, we’ll install Xdebug via PECL:

/usr/local/Cellar/php/8.0.3/bin/pecl install xdebug

This will take a minute or two depending on your machine and internet connection. When it’s finished you should see this output somewhere near the bottom of your terminal.

install ok: channel://pecl.php.net/xdebug-3.0.4
Extension xdebug enabled in php.ini

If you see these lines, congratulations, you’ve successfully installed Xdebug!

Now, we need to configure one more thing. By default, Xdebug starts in develop mode, which isn’t super useful. It enhances the default error screens and var_dump() output with better formatting and stack traces, but it doesn’t enable us to do step-through debugging.

Run the command below to find your php.ini location.

php --ini

This will show you all of the configuration files currently being used by PHP. By default, the Xdebug configuration is added to php.ini. Open that file in your favorite text editor. Mine was located at /usr/local/etc/php/7.4/php.ini

At the top of that file, just below the zend_extension="xdebug.so" line, add xdebug.mode="debug"

This tells Xdebug to start in debug mode, for, well, debugging!
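For reference, the top of my php.ini ended up with these two lines:

zend_extension="xdebug.so"
xdebug.mode="debug"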

While you’re on the command line, go ahead and run pecl install redis if you work on projects that utilize Redis. The PHP extension is much faster than the predis/predis Composer package.

You can restart Valet with the following command and then use PHP Monitor to confirm Xdebug is installed. You must restart PHP for new extensions to be enabled.

valet restart
PHP Monitor Settings

You can also verify Xdebug is installed by running php -v in your terminal and checking that the “with Xdebug v3.0.4” line is included in the output, or by running the phpinfo() function and searching for Xdebug information in the output.

Configuring VS Code for Debugging

Before we begin configuring VS Code, we need to tell Xdebug that we would like it to operate on incoming web requests. Navigate to one of your PHP projects in the browser and enable the Xdebug helper Chrome extension for that site.

You may need to click on the Puzzle Piece icon in Chrome to show all of your installed extensions.

This debug setting appends an Xdebug-specific Cookie to every request made from this URL. When Xdebug sees this Cookie, it knows to prepare itself for a debugging session. This allows us to essentially bypass Xdebug when we don’t need it, so our web requests are still lightning fast. I believe this is a new feature (or at least exposed more directly) with version 3.

You can verify the Cookie is being sent by inspecting requests from the site in Devtools. The XDEBUG_SESSION cookie is the magic sauce! 🔥


Now that our browser is sending the correct Cookies and Xdebug is installed, we can configure VS Code to listen for incoming debugging sessions. This is an important thing to understand: when debugging is enabled, Xdebug initiates the connection to your IDE, not the other way around. We must configure VS Code to respond at the location Xdebug expects (or change Xdebug to connect to different ports/hosts). We’ll be sticking with the defaults for this tutorial because it’s easier.
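If you do need to change where Xdebug connects, the relevant Xdebug 3 settings live in the same php.ini as above. To my knowledge these are the default values, so you only need to add them if you want something different:

xdebug.client_host=localhost
xdebug.client_port=9003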

Open your project in VS Code and enable the PHP Debug extension I referenced earlier. Open the debug tab in VS Code and click the gear icon near the top of your sidebar.

This will generate a launch.json file for you in the .vscode directory at the root of your project. You can choose to commit this to version control or not, though I generally would not recommend it since Xdebug configurations may be different across your team.

The only thing we need to change in this file is the port that VS Code is responding on. The correct port for a default install of Xdebug is 9003. You can change it for both blocks in the config file.
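As a rough sketch, the listen configuration in my launch.json looked something like this after updating the port (the generated file also includes a second block for launching the currently open script, which gets the same change):

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Listen for Xdebug",
            "type": "php",
            "request": "launch",
            "port": 9003
        }
    ]
}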

After this, I generally find it worthwhile to completely restart VS Code. After restarting, open the Debug tab in VS Code again and click the green arrow to “Listen for Xdebug”.

We can now add a breakpoint by clicking to the left of any line of PHP code so that a red dot appears. I recommend adding it to a variable assignment or the first line of a function.

The default public/index.php file of a Laravel Project.

Now, open your browser, hit refresh, and hopefully VS Code will automatically focus on your breakpoint! You can disable VS Code breaking for “Everything” by unchecking the relevant box in the “Breakpoints” section of the debug sidebar.

Conclusion

Hopefully this has been helpful to you to enable Xdebug while using Valet on a Mac. If you have any questions, feel free to reach out to me on Twitter! If you want to add something that I may have missed, reach out and I will update the post and give you credit.

PHPUnit and Laravel Config Caching

This week at work, I ran into a fun issue with a colleague while running some unit tests that I wanted to document here.

I was writing some new unit tests for a legacy codebase that’s been around a long time. For the sake of portability, I decided to use an in-memory SQLite database. After writing the tests and finishing the other work for the ticket, I opened a PR and went about my other tasks.

My colleague pulled down my ticket for testing and ran the tests. What should have been a very simple test was failing! What was weirder is that the failure was happening during one of the database migrations. The error itself isn’t important, but for the sake of completeness, it was a foreign key constraint error on the database.

After trying to debug things with them, we also noticed that their local database had been entirely nuked. Considering the test was running with the RefreshDatabase trait, I was starting to get an idea of what was going wrong.

What happened?

The empty database was the big clue that led me down the rabbit hole. I knew that running the database migrations is a slightly different process for in-memory vs. a real database, so I dug into that trait first.

This method in the call stack stuck out to me:

protected function usingInMemoryDatabase()
{
	return config('database.connections')[
		config('database.default')
	]['database'] == ':memory:';
}

If the config value for the default database is :memory:, then this returns true. Since PHPUnit overrides your .env values when the tests are run, this should be the correct value. Dumping it, however, we got a different result.
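For context, those overrides come from the <php> block of phpunit.xml. A typical setup for an in-memory SQLite database looks something like this (trimmed to the relevant entries):

<php>
    <env name="DB_CONNECTION" value="sqlite"/>
    <env name="DB_DATABASE" value=":memory:"/>
</php>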

For the config() function to be returning a different result than what is in a configuration file already, that must mean a config cache file exists. You can read more about how Laravel potentially includes that file here.

In our case, there was a config cache present, so when Laravel was booting to run the tests, that config was being used instead of PHPUnit’s environment variables.
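The quick manual fix is a single Artisan command before running the tests again:

php artisan config:clear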

To tie it back to the error we were seeing: since the local MySQL database was being used instead of the in-memory database, a migration with some outdated seeding code was throwing the foreign key constraint error. This is also a good reminder that MySQL and SQLite are not the same! If you expect foreign key constraints to be enforced in your tests, use the same database for both your test suite and your running application.
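As an aside, SQLite can enforce foreign keys if you ask it to. Laravel’s default sqlite connection in config/database.php exposes this via the foreign_key_constraints option, roughly like the snippet below, though even with it enabled the two engines differ enough that matching them is still the safer bet.

// config/database.php sqlite connection, trimmed to the relevant keys.
'sqlite' => [
    'driver' => 'sqlite',
    'database' => env('DB_DATABASE', database_path('database.sqlite')),
    'foreign_key_constraints' => env('DB_FOREIGN_KEYS', true),
],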

The Solution

In this case, I decided it was best to try and fix the problem for the user running the tests. I modified the base TestCase.php as follows.

<?php

namespace Tests;

use Illuminate\Support\Facades\Hash;
use Illuminate\Contracts\Console\Kernel;
use Illuminate\Contracts\Foundation\Application;
use Illuminate\Foundation\Testing\TestCase as BaseTestCase;

abstract class TestCase extends BaseTestCase
{
    /**
     * Creates the application.
     *
     * @return \Illuminate\Foundation\Application
     */
    public function createApplication()
    {
        $app = require __DIR__ . '/../bootstrap/app.php';

        $app->make(Kernel::class)->bootstrap();

        Hash::driver('bcrypt')->setRounds(4);

        $this->clearCache($app); // Added this line.

        return $app;
    }

    protected function clearCache(Application $app)
    {
        // We don't have a cached config, so continue running the test suite.
        if (!$app->configurationIsCached()) {
            return;
        }

        $commands = ['clear-compiled', 'cache:clear', 'view:clear', 'config:clear', 'route:clear'];
        foreach ($commands as $command) {
            \Illuminate\Support\Facades\Artisan::call($command);
        }
        // Since the config is already loaded in memory at this point, 
        // we need to bail so refresh migrations are not ran on our
        // local database.
        throw new \Exception('Your configuration values were cached and have now been cleared. Please rerun the test suite.');
    }
}

If no config cache file is found, then we’re golden! Continue running the test suite. If we do find a cached config, we run a few Artisan commands to clear all of those files and then throw an Exception. The exception is necessary since the config is already loaded into memory at this point; if the tests continued to run, all of the requested config values would be the old ones. While it may be possible to reset the config and continue running the tests, the utility of that is minuscule. Instead, we inform the user of the action that was taken and ask them to run the test suite again.

In an ideal world, you would never have a configuration cached locally, as its main purpose is to speed up expensive disk operations in production, but alas, we don’t live in an ideal world! 😅


I want to give a shoutout to Joel Clermont for helping me debug what was going on and being a great rubber 🦆!

How to forward catch-all email after canceling Google Apps subscription

I am doing some digital house cleaning and I’ve wanted to cancel a Google Apps business account for a while now, but the ease of using catch-all addresses on my domains has been great and I didn’t want to give that up. Plus, the interface for Gmail isn’t that bad.

I know there are paid services out there that offer similar functionality and I was ready to buy one until I came across this!

Enter https://forwardemail.net/ 🎉

It’s an open source, free email forwarding service that can handle catch-all addresses. Setup is simple: add some MX records, then add a TXT record with a forwarding address. This does make your forwarding address public, but since I use Hey, I have pretty good control over who is allowed to email me thanks to The Screener.
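For illustration, the records look roughly like this (I’m going from memory here, so double-check the exact hostnames, priorities, and TXT format against their documentation):

example.com.  MX   10  mx1.forwardemail.net.
example.com.  MX   10  mx2.forwardemail.net.
example.com.  TXT  "forward-email=you@example.com"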

Just in case, I did change the few accounts still using those custom domains, but I’ve tested it and so far within about 20 minutes things seem to be propagating nicely. I’ll be keeping an eye on it over the next few weeks but an hour in and I’m pretty happy!

Check it out!

Health update – March 2021

Been a hot minute since my last update! To those of you that reached out to check on my progress, I am very appreciative and touched that you take the time out of your day to think of me. It’s encouraging to my soul. ♥️

Since my last update in January, things have been pretty a-ok. I’ve been seeing a therapist since that time and while I don’t feel that I’ve come to any sort of dazzling revelation, I keep scheduling appointments at the end of every session. I think part of what I’m latching onto is the accountability of talking to someone. I pay this person $100+ a month to listen to me give progress reports of my mood and feelings, and my physical health obviously plays a big role in that. Maybe I would do just as well with one of those gambling apps where you put down some money to accomplish some task and if you fail, they don’t give it back?

Exercise

My Apple Fitness workout history

As you can see from the picture here, I’ve gotten into working out a little since the last week of January. 95% of the time, this is riding the stationary bike in my building’s gym, but sometimes I’ll go for a long walk up and down the parking garages surrounding my house (it’s been cold though!). It’s light exercise by any standard; very low impact, and I usually stop soon after my exercise ring closes, at about 30 minutes. In that time, the bike says I travelled 6-8 miles, depending on my pace.

The biggest break was the week of February 15, when Texas had the snow-pocalypse that cut power to millions of people and dozens died of exposure, indoors. While I won’t go into the politics of it all, needless to say that week was weird. My partner and I went to my parents’ house on Monday night and were there most of the week. My parents don’t have a gym and I didn’t want to go trekking in the snow just for a workout, so I stayed inside.

I can’t say that the workouts are providing a lot of physical benefit. My weight hasn’t dropped drastically (as will be evident soon), but I can’t definitively say there is no benefit at all. This past week I’ve been working out later in the evening, around 10-11pm, and have largely been motivated to do that thanks to the support from some colleagues at work. A few of us have Apple Watches, and getting the notification that someone closed all of their rings for the day is a surprisingly effective motivator. The fact that I am responding to that pressure in a positive way is probably a good thing. I can easily think of times past when I would have laughed at those notifications and gone back to watching TV and eating an entire bag of Doritos (…now I want Doritos…).

As the weather begins to warm up here in Texas, I want to try taking my rides outdoors as I have acquired an actual bicycle from a friend. I’m interested to see how that goes over the next few months.

Diet

My diet has been generally the same as before and I’ve done a decent job of sticking to my targeted macros. In the past two weeks, I’ve fallen back into my Wingstop habit, which is not sustainable calorically or financially, but the dopamine hit has been undeniable at times.

My go-to dinner has become a large salad with 4-5oz of greens, an ounce or two of goat cheese, fresh strawberries, and a handful of nuts with some sort of low carb dressing. I pair that with some protein, usually chicken of some variety, about 6-10 oz if I had to guess. Chicken thighs are my favorite though.

When I do get Wingstop, I’ve now transitioned to only ordering chicken (without my usual large side of fries). That alone is a reduction of close to 1000 calories and 100 carbs when you factor in the ranch dipping sauce. If I have the ingredients available at home, I will first make a salad like I described above, eat that, then eat the wings. The greens soak up a lot of the grease, which upsets my stomach less while also helping me feel fuller. I noticed this last night as I finished eating around 8:30pm and didn’t feel compelled to eat more the rest of the night.

At 10pm, I noticed the time and thought I wanted to get a snack, but managed to talk myself into drinking a big glass of water instead. My usual after-dinner snack can be 500-800 calories of roasted nuts, so that’s a great calorie savings there.

I still have a weakness for snacks, which I tend to eat after dinner but I think I’m doing better. For instance, I’ve started eating fresh strawberries as a snack which is surprisingly less bad than I originally thought. A pound contains 150 calories and 22g of carbs with about 8g of fiber. Those aren’t keto macros, but I can often feel stuffed after finishing a container. Satiety is my biggest goal when eating so that’s fine by me.

I’m still taken with how my tastebuds have changed in reaction to my decreased sugar intake. The strawberries now taste so sweet to me, it makes one part of my brain think that there’s no way they’re healthy, but they are! A few weeks ago I got a craving for a shake from a fast-food joint and it was insanely sweet. I still drank it, but I honestly haven’t had the craving since and writing about it now, I am unfazed.

If I had to rate my diet since the last entry to this series, I’d probably give it a solid 7/10. Not perfect, but not a total failure either. C+

Weight Loss

The graph tells most of the story, but I’ll elaborate. I’m not sure if I’ve put numbers out there before, but the end of 2020 was rough. As you can see, I gained 20 pounds in about 2-3 months, despite intermittent fasting 16+ hours a day during that entire time.

Since January, I’ve been back to my 5 meals a week eating schedule and making progress in a positive direction. At the end of January, I purchased a new bathroom scale which has been encouraging and like I mentioned earlier, I’m now also working out every day. My body fat percentage has decreased by about 4% in that time, which feels good to me. I’m currently just under 38% body fat, which is still a lot, but it’s miles better than the high 40s I was rocking in 2018.

February was basically flat in terms of weight loss for the entire month, with some ups around when the storm hit. Considering one night that week I ate an entire large pizza by myself, probably not the worst outcome! March has been better, though we’re only 6 days in.

Tonight is my dad’s birthday and we’re going to an all-you-can-eat steakhouse and I’ve already resigned myself to indulging on some mashed potatoes and likely a full pound of meat, but honestly, it’s probably not that big of a deal. This month, I’d like to try and focus on mindfulness when it comes to snacking as that’s where I believe the bulk of my hindering calories come from. The roasted nuts especially.

If I can get under 250 by the end of the month without immediately fluctuating above that number in April, I’ll be really happy I think. That’s 1 pound a week which should be totally doable and sustainable. If I can keep that up for a year, I’ll be entering 1-derland in time for Summer 2022. That’s exciting!

I think I mentioned this before, but I’ll say it again because I’m thinking about it. Winter 2020 was a huge bummer for me because I had done so well in 2019. I bought new clothes that I was excited to wear, especially the following winter, that I ended up not being able to wear comfortably at all without risking damaging them. That fucking sucked. If I have one goal for 2021, it’s to be able to wear my new coat comfortably all through next winter. No bulging buttons or getting so hot that I need to take it off after less than 5 minutes indoors.

Final Thoughts

Thank you to everyone who has supported me over the past few months. Thank you to my wife, who doesn’t complain when I suggest Wingstop every time a discussion about dinner comes up. Thank you to my parents, who have been supportive of my lifestyle changes and don’t pressure me to eat when I don’t want to. I know there are a lot of people out there who have differing views on fasting for weight loss, and my thoughts are with them as we all navigate this rocky journey to better health together.

If you want to chat about how your life is going, food related or not, hit me up on Twitter. My DMs are open. Talkin’ to new people online is really fun for me and I’ve been told I’m an excellent sounding board.

✌️


Photo by Dovile Ramoskaite on Unsplash

Installing Laravel Spark Manually with Composer – 2021

For whatever reason, you may need to install the new Laravel Spark into your project without following their installation instructions. Sometimes you don’t control the server environment or the devops process, so it’s just easier to include the files in your project. Here’s how to tell Composer where the files are and how to load Spark.

First – Update composer.json

Similar to the official install docs, you need to add a snippet to your composer.json file.

"repositories": [
   {
     "type": "path",
     "url": "./spark-stripe"
   }
],

In my case, I have a folder in the root of my project called spark-stripe in which I placed all of the package files. Does it matter what the name is? I don’t know but since that is the package name, it made the most sense to me.

Final – Install the Package

Lastly, you’ll need to install the package like you normally would. For what it’s worth, I’m using Composer v2.

composer require laravel/spark-stripe 

If you don’t see a new list of dependency packages being installed, you likely did something wrong. Go back and check your composer.json file for spelling errors or other mistakes.

Hope this helps you out!

How to add a Macro to the Laravel HTTP client facade

While working on Let Them Eat 🍰, I came across some peculiar behavior. For some of the Slack web APIs, there is a requirement to send the requests as a URL-encoded form object. Since all of their API endpoints support this method of access, I created a little helper on my user object to get an instance of the Laravel HTTP client with the asForm() method already applied.
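The helper itself isn’t shown here, but it was essentially something along these lines (the method name, token property, and base URL are made up for illustration):

// Hypothetical helper on the User model; the real property names and URL handling may differ.
public function slackClient(): \Illuminate\Http\Client\PendingRequest
{
    return \Illuminate\Support\Facades\Http::asForm()
        ->withToken($this->slack_access_token)
        ->baseUrl('https://slack.com/api');
}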

This was working great until today, when I wanted to add support for blocks in one of my bot responses. While I could support sending that field as a JSON string, it felt better to change the request to be sent fully as JSON instead. I thought that would be as simple as calling ->withHeaders() again, but unfortunately, the deep recursive merge used by the HTTP facade doesn’t clear out any existing values.

Http::asForm()->asJson()

"headers" => [
  "Content-Type" => [
    0 => "application/x-www-form-urlencoded",
    1 => "application/json"
  ]
]

Obviously, the best option would be to not call the facade with multiple methods that change the same object, but in this case I really wanted to just overwrite it for this one instance.

Enter, Macros!

There are lots of places online to read about Laravel Macros, so I won’t go into it too deeply here, but the gist of it is that you can add custom methods to core objects without extending them and creating new classes. This can be super helpful when you just want to add a little helper method but don’t want to go through the process of extending the class the old-fashioned way. This can be especially useful for accessing protected properties or private methods.

I knew all facades in Laravel are macroable, so I jumped into my AppServiceProvider boot method and added a lil somethin’ somethin’.

// within AppServiceProvider::boot()
// Requires a `use Illuminate\Http\Client\PendingRequest;` import at the top of the file.
PendingRequest::macro('clearContentType', function () {
  $this->options['headers']['Content-Type'] = [];
  return $this;
});

Originally, I tried to add the Macro directly to the Http facade, but that only ended up working if I called the new method statically. To have it work in the manner I expected, I had to add the macro to Illuminate\Http\Client\PendingRequest, which is the class that is bound to the Http facade under the hood.

Using my new macro, I can easily clear out any content type headers before making a request, no matter how many times I call methods that set the content type header.

Http::asForm()->asJson()->asForm()->asJson()->asForm()->clearContentType()->asJson()

"headers" => [
  "Content-Type" => [
    0 => "application/json"
  ]
]

Now, I suspect this is a bug, but I’m not sure. I’ll be opening an issue on GitHub and we shall see. In the meantime, the macro will have to do! 🙂

Running Multiple CLI Commands in One Terminal for Laravel Development

While working on Let Them Eat Cake 🍰, I noticed that when I was end-to-end testing the app locally, I needed a few different things running at once:

  • Laravel Queue Worker – running queued jobs
  • Expose – Proxy for allowing external webhooks needed for my app
  • Stripe CLI – Passing Stripe webhooks to my app without using a proxy
  • Laravel Mix – Bundling static assets

Running each of these commands separately isn’t that big of a deal, but it can be a hassle to remember to start them all. This was very apparent yesterday as my computer decided it wanted to reboot every time it went to sleep! A short trip around town running errands with my wife and my computer restarted 3 times in a couple of hours; frustrating!

Today I decided to look into some potential solutions. I had a few basic requirements while I was looking around.

  1. It needs to be easy to start and modify for each new project – this also implies some flexibility. No sudo password required nonsense.
  2. I can stop all running processes with SIGINT (CTRL+C)
  3. All process output is still visible so I can see compile errors or bad requests.
  4. Process output is in one terminal window/tab (optional – but I wanted it!)

A little searching and some testing later and I think I’ve come to a nice solution that gives me plenty of control and flexibility.

One of the first tools I discovered was moreutils, a package you can install with brew that essentially adds a handful of helpful CLI utilities. The name is a play on coreutils, a set of utilities included in most Linux distros. Specifically, there is a tool called parallel that seemed to be just what I was looking for. It has lots of extra features and can start lots of parallel processes and close them all at once. The reason I ultimately chose not to go this route was in case I work on any of my projects with other developers; given the simplicity of my use case compared to the feature set of the package, it was honestly overkill. Skipping it also means my project is that much more portable between machines and developers. I keep my own dotfiles, but if I can avoid complexity, I like to try that route first. 🙂

The Solution

Ultimately, I settled on a little shell script courtesy of this Ask Ubuntu thread. Here is my adaptation:

#!/bin/bash
sh ./startexpose &
P1=$!
php artisan queue:listen &
P2=$!
stripe listen --forward-to eat-cake.test/stripe/webhook &
P3=$!
wait $P1 $P2 $P3

A quick rundown: the first process is essentially another wrapper with some Expose configuration details. The output from this is a big tail-ing table of incoming requests. The second is my queue worker. I opt for queue:listen here so the worker will automatically restart itself when any code in my application changes. Output from this command is a line for every queued job that is processed. Third is the Stripe CLI, which needs to be configured beforehand, but that’s a once-every-90-days thing. Seeing as it has access to my payment data, I’m okay with the hassle tradeoff.
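For the curious, the startexpose wrapper is nothing fancy; it’s roughly a one-liner like this (the site and subdomain here are placeholders, and your Expose flags may differ):

#!/bin/bash
# Hypothetical wrapper around the Expose CLI; adjust for your own site/subdomain.
expose share eat-cake.test --subdomain=eat-cake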

I’m still debating whether or not to include my Laravel Mix watch command in this, and I might add a runtime flag for it, but without trying it for any reasonable length of time yet, I’m guessing I’ll need to start/stop the watcher to do production compiles more often than anything else. The other commands can run mostly unmonitored unless I need to check something, but I tend to get a lot more feedback when I inevitably screw up the closing brackets in a JS file. 😬
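If I do end up adding it, the runtime flag could be as simple as an optional block at the top of the script above; a rough sketch, with a made-up --mix flag:

# Hypothetical addition to the script above: pass --mix to also start the asset watcher.
if [ "$1" = "--mix" ]; then
    npm run watch &
    P4=$!
fi

# ...the existing commands from the script above go here...

# ${P4:-} expands to nothing when the watcher wasn't started.
wait $P1 $P2 $P3 ${P4:-}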

Hope this helps you out! Feel free to ask any questions here or send me a tweet @DaronSpence. ✌️

Migrating Data and Merging Models in Laravel

On one of my side projects, Let Them Eat 🍰, I recently needed to do some migrations to combine/merge two models. I had originally optimized a bit too much, and later realized things would be a lot simpler if I only had one model.

There wasn’t a ton of info about this online, so I’m going to do my best to try and explain what I did. For this migration, I was using Laravel 8.

So, it turns out merging models is kind of a pain! In my case, I was merging a SlackUser::class model with the default User::class model that ships with Laravel. From a data perspective, it wasn’t too bad; I needed to add a few columns to the User table that were previously on the SlackUser table. The issues arose when I realized there were a lot of places in my code that relied on accessing each model off of a relationship of the other. So, lots of calls to $user->slackUser and $slackUser->user intermingled across the app, depending on what I was doing at the time.

Since my ultimate goal was to completely delete any reference to a SlackUser, I had to take a careful approach when modifying the database.

First, I added the extra columns I needed to the users table.

Schema::table('users', function (Blueprint $table) {
  $table->string('slack_user_id')->after('id')->nullable();
  $table->foreignId('team_id')->after('slack_user_id')->nullable()->constrained()->onDelete('cascade');
  $table->boolean('is_owner')->after('team_id')->default(false);
  $table->string('avatar_url')->after('email')->nullable();
  $table->string('timezone')->after('avatar_url')->nullable();
  $table->boolean('is_onboarded')->after('timezone')->default(false);
  $table->softDeletes();
});

I continued with much the same process for the other tables that needed to be modified.

After the tables were modified, I queried all of the SlackUsers and looped over them to create new User models if they didn’t have one already. In my app, a User model was only created if the person logged into the web app; otherwise they would happily live on as only a SlackUser. Now, everyone gets a User model, and I don’t really care if they log in or not!

Some advice on the Shifty Coders Slack recommended not relying on Eloquent here. This makes sense, as in a future release I’ll be completely removing the SlackUser model, so if I relied on Eloquent for the migration, it would throw an error once I deleted that class. Here’s what the migration looked like:

DB::table('slack_users')->get()->map(
	function ($slackUser) {
		$userId = $slackUser->user_id;
		$slackUser = Arr::except((array) $slackUser, ['id', 'created_at', 'updated_at', 'user_id']);
		$user = User::findOrNew($userId);
		if (!$user->exists) {
			$user->name = $slackUser['slack_user_id'];
			$user->password = Hash::make(random_bytes(20));
		}
		$user->fill($slackUser);
		$user->save();
	}
);

This is all pretty straightforward. For each SlackUser, check if they have a User model already, and if not, create a new User and give them a random password. My app doesn’t actually use passwords for authentication, instead relying solely on Slack Oauth, so the password field is irrelevant. In the future I may want to allow for other auth methods, so I left it for the sake of simplicity.

After all of the users were migrated, I could then go about the business of updating other models that used the SlackUser as a foreign key. In my case, I had messed up in the original migrations and not enforced those foreign keys, but if they are enforced in your app, you’ll need to drop the foreign key before you go about migrating all of this data around.

$table->dropForeign(['slack_user_id']);

Here is what the migration to change the foreign keys on my Cake model looked like.

DB::table('cakes')->get()->map(
	function ($cake) {
		$giver = DB::table('slack_users')->select('slack_user_id')->where('id', '=', $cake->giver_id)->get()->first();
		$giver = User::where('slack_user_id', '=', $giver->slack_user_id)->withTrashed()->first();
		$target = DB::table('slack_users')->select('slack_user_id')->where('id', '=', $cake->target_id)->get()->first();
		$target = User::where('slack_user_id', '=', $target->slack_user_id)->withTrashed()->first();
		DB::table('cakes')->where('id', $cake->id)->update(['giver_id' => $giver->id, 'target_id' => $target->id]);
	}
);

Again, notice that I’m not using Eloquent to access the SlackUser model, instead relying on the DB facade. I’m free to delete the SlackUser::class at any time now!

All of this code was added to the up() method of my migration, and I carefully reversed all of the column changes for the down() method. One thing I did not do in the down() method was re-migrate any data. I figured if the deploy went so badly that I needed to do that, then I would be better off restoring the database entirely from a backup instead. The down() changes I made were purely so I could migrate up/down for tests, which weren’t reliant on any database values anyway.
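For what it’s worth, the down() side was mostly just dropping the columns the up() method added. A rough sketch for the users table (the real migration touched other tables too):

Schema::table('users', function (Blueprint $table) {
	// Drop the foreign key first so the column itself can be removed.
	$table->dropForeign(['team_id']);
	$table->dropColumn([
		'slack_user_id',
		'team_id',
		'is_owner',
		'avatar_url',
		'timezone',
		'is_onboarded',
	]);
	$table->dropSoftDeletes();
});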

That brings me to another pain point: tests! 90% of my tests had to be updated since they mostly relied on the SlackUser. I started with one test file at a time, running the entire file first, then each failing test. I generally changed any instance of SlackUser to User first and then saw what broke. The first few were painful, as there were references to relationships that needed to be updated. Oftentimes while doing this, I would catch something that needed to be updated in my migration as well.

Eventually, most of the methods from SlackUser were migrated to User and all of the relationships were updated. Views were the last thing to be checked and I even managed to add a few missing tests based on failures I found while manually browsing the site locally.

// One of my newly added tests!
/** @test */
public function a_user_can_view_the_perk_redemption_page() //phpcs:ignore
{
	$this->withoutExceptionHandling();
	$user = factory(User::class)->create([
	  'is_owner' => true
	]);

	// Make fake users for the manager selection.
	factory(User::class, 5)->create([
	  'team_id' => $user->team->id
	]);

	$this->actingAs($user);

	$user->team->perks()->create([
	  'title' => 'an image perk',
	  'cost' => 100,
	  'image_url' => 'https://perk-image.com/perk.jpg'
	]);

	$this->assertCount(1, Perk::all());

	$res = $this->get(route('redeem-perk-create', Perk::first()));
	$res->assertOk();

	// Check for all of the names on the page. (manager selection for perk redemption)
	$res->assertSee([...User::all()->map->name]);
}

In the end, the GitHub PR had 57 changed files! A huge undertaking by any standard. I would venture a guess that those 57 files represent 80-90% of all of the code I had written for the app.

Overall, I’m happy I did this, as the logic surrounding users is much simpler to understand. I also got the opportunity to try out some new things and learn a bit more about the built in DB facade. I’m still kinda intimidated by SQL in general, but I’m getting more courageous every time I tackle one of these projects. Backups are still really important though! 😉

Conclusion

If you have any questions, feel free to reach out to me on Twitter or leave a comment!

January Update Pt 2: Fasting Boogaloo

Well, here we are a week later and I have to admit I feel better. I’ve overloaded on fasting propaganda from across the internet, though primarily YouTube and Reddit.

I’ve gone back and listened to a few lectures by Dr. Jason Fung. His “The Aetiology of Obesity” series has been my sleeping playlist for a few nights now. It’s a 6 part lecture series, each lecture running about an hour, discussing theories on why humans become obese and the different factors that can contribute. Overall just interesting stuff, plus he has a pleasant voice (IMO) and he’s very easy to digest (heh).

Practically, not much has changed re: my diet compared to earlier fasting cycles. Tuesday, I broke a ~44-hour fast with a burger patty topped with a fried egg and a half pound of salad with strawberries, feta, walnuts, and a few spoonfuls of a ranch-like dressing. I followed that up with some popcorn and a few slices of bacon.

Last night I ate again, this time getting lured into a chicken bowl with cauliflower rice from Chipotle. I walked over there and back home, proceeding to pile the contents of my bowl onto some low carb tortillas I already had at home. I followed that up with the rest of the strawberries, some more popcorn, and 3-4oz of pistachios. I drank 3 cans of sparkling water with dinner, and by 9pm I felt pretty full, so I decided to start my fasting timer an hour early.

Now I begin another 40+ hours until dinner on Friday evening. Sleep has been pretty meh, but that’s to be expected while fasting. I was tired earlier and managed to be in bed by 11pm, which is rare for me. I woke up around 6 this morning and got out of bed around 7:30 to get some coffee and write this post.

I have a therapy appointment this afternoon, the second with my new therapist, and I’m tentatively looking forward to it. If the weather isn’t too bad I might try and walk over there.

Overall, this week has been a step in the right direction and I’m optimistic it will continue to trend in that way.

One thing I do want is to find some more fasting support from local friends. I don’t know what we’d talk about but it feels intuitively easier to stick with something when there is accountability, even if it’s unspoken.

My dad is kinda doing IF right now with his church’s new year resolution/diet/fast. He’s had good results so far but I’m not sure how he feels about continuing it after his church finishes the fasting period. I’ll text him later today and see how things are going.

As always, feel free to tweet me @daronspence or leave a comment if you have questions.

✌️

Back on the horse, maybe? January 2021

Fuck, 2020 felt like a dumpster fire to a lot of people in the world, and that means me too!

Note: please excuse me if it feels like I’m rambling. It’s late and I’m typing this on a keyboard I’m not really used to so it takes longer to get my thoughts out with extra editing.

I’m just so disappointed in myself. I know it’s late as of the time of this writing, but I think that might just point towards something I feel is true. I know what I need to do, but that split-second instinct to run towards the comfortable is obviously hard to ignore.

Just now, I thought it might be a good idea to go to Whataburger just to get some unsweet tea, since they’re the only place open late with anything close to fresh tea. But I know I’ll be tempted to order something else in that drive thru line. It’s too easy and I’m looking for any excuse.

In 2019, my wife bought me a coat on a random trip to the mall. It’s a very nice peacoat that cost her a lot of money, but she knew how much I missed my old one and she figured I deserved it for whatever reason. I really like it and I got a lot of compliments on how good I looked when I wore it out and about, especially with people I had not seen in a year or two.

It finally got cold enough this past month to warrant wearing a coat outside more. I got up at 6am to get ready to volunteer at church when I realized it would be an excellent day to wear the peacoat. I was devastated when it didn’t fit. I almost wore it anyway to prove to myself it was possible. I could have stretched it out and done up the buttons, but I knew deep down in that instant that I was lying to myself and it wasn’t worth it. I reached over a few hangers and grabbed the biggest coat I owned, which is also much too insulated for indoor wearing, and zipped it up. I distinctly remember my hand passing over the 4 other jackets in my closet and settling on this one. It’s much too big, but it’s comfortable and it hides my imperfections. It hides the overeating I’ve done for months.

But it can’t really hide me from myself…

So tonight, I’m writing something. Maybe it’s more empty promises. I’m not sure. I’ve saved a few of my favorite salad restaurants in a Google map, hopeful that I’ll turn to them in my next moment of weakness, but I’m not sure I’ve convinced myself yet.

I think I’ve barely convinced myself this blog is worth posting.