"We're gonna run what I call a Kyle drill."
A man wearing sunglasses and carrying an assault rifle talks his way through a training circuit he's built at a gun range, showcased in a YouTube video.
The course lets participants recreate the moment Kyle Rittenhouse shot three protesters in Kenosha earlier this year, killing two of them.
"This is the simulated mob," the man says.
"You're going to sit down and take a shot at the skater. I don't know how many shots Kyle took, but Kyle's a badass. So we're going to assume one shot, one kill."
The skater he is referring to is Anthony Huber. He was shot in the heart and killed by Kyle Rittenhouse.
Rittenhouse, who was 17 at the time, turned up at a protest in Kenosha after Jacob Blake was shot by police.
He was carrying an assault rifle and said he was there to protect property, claiming he acted in self-defence when opening fire. He is awaiting trial for double murder.
This piece isn't about the shooting itself, but rather about what it tells us about YouTube and its policies on extremism.
The Kyle Drill video is just one of dozens of disturbing uploads we found on YouTube venerating Rittenhouse.
Other social media companies like Facebook have tight rules on what you can and can't say or show about Rittenhouse. Facebook, for example, has banned his name from being searched for.
On YouTube though, there are no such rules.
'YouTube has fallen behind'
"Facebook and Twitter have taken much more concerted action against content supporting Rittenhouse," says Chloe Colliver from the Institute of Strategic Dialogue.
"YouTube has fallen behind other social media companies in the US this year in its efforts to deal with extremist content and disinformation."
That last sentence is one I've heard many times covering extremism on social media this year - that YouTube has a moderation problem.
The company has a set of rules that "prohibit any violent or graphic content intended to shock viewers".
"We take swift action to remove content flagged by our community that violates those policies," YouTube told the BBC.
The glorification of Rittenhouse on YouTube, however, suggests community flagging simply isn't working.
"Kyle Rittenhouse is an inspiration to me" is the first line of one YouTube video we found. The man speaking is holding a gun.
So we decided to show YouTube a handful of these Rittenhouse videos for comment.
First, that Kyle drill video.
This video was deemed by YouTube to have broken the platform's rules on glorifying violence, and has now been removed.
However, another Rittenhouse training video, where a group of men are doing something almost identical - recreating Rittenhouse's shooting at a gun range - was not taken down.
Instead, YouTube decided to put an age restriction on it.
It's hard to see why one video was taken down and the other left up. YouTube did not respond when we asked for clarification.
Next, a video we found showing Rittenhouse as a video game character on a platform game. He runs around shooting protesters and picking up ammunition.
At the end, the final "boss" is Anthony Huber with his skateboard. The character shoots him, completing the level.
After showing this to YouTube the company once again deemed it unacceptable, and took it down for breaching its rules on glorifying violence.
But when shown a video explaining how to set up your gun like Rittenhouse's, YouTube neither removed it nor gave it an age restriction.
Next, a song called The Kenosha Kid. This is a ballad about Rittenhouse, portraying him as a hero making a stand against unruly protesters. This video was deemed to break YouTube's rules and was banned.
But many other videos using exactly the same song have not been banned.
Once again, the distinction between videos that are acceptable or unacceptable is hard to understand.
Many of the comments on these videos call Rittenhouse a hero. Others express surprise that they are allowed on YouTube.
Of course, YouTube is also a great place to monetise content.
Profiting from Rittenhouse
The "merch shelf" is a way of selling merchandise beneath a YouTube video. We found pro-Rittenhouse videos where "Free Kyle" T-shirts were being sold.
Should people be able to profit from the Rittenhouse killings?
Well, YouTube said no - they took the ads down after we highlighted them.
"Upon review, we have taken action and removed the ads", it said.
The rules, though, appear to be slightly arbitrary - as is often the case with extremist material on the platforms of Big Tech.
YouTube acted quickly once the BBC had notified it of these videos. But it took a journalist to point out that this content was being hosted and sold on its own platform.
"This is another instance in what is becoming a trend where YouTube lags behind other platforms," says Angelo Carusone from Media Matters.
"Yes, some pro-Rittenhouse content still gets through on Facebook, but the reason why YouTube is currently a safe haven and engine of Rittenhouse content is because they simply haven't even bothered to grapple with it at all."
Having gone through this process with YouTube, it's unclear why some videos are left up while others are taken down.
And in many ways, YouTube's inconsistent policy on Rittenhouse illuminates a larger problem it has with extremism more generally.
There's simply a lot of stuff on the platform that is right on the edge - and YouTube is struggling to define where the line is.
James Clayton is the BBC's North America technology reporter based in San Francisco. Follow him on Twitter @jamesclayton5