More than a week after YouTube star Logan Paul caused controversy by posting a now-deleted video in which he apparently discovered the body of a man who had died by suicide, the Google subsidiary has finally commented on the affair, in a series of tweets posted earlier this week. The video prompted two separate apologies from Paul himself, as well as countless response videos condemning the original, many of which were allegedly removed by YouTube.

“Many of you have been frustrated with our lack of communication recently. You’re right to be,” the company led off in its first tweet. The statement—which went unsigned and unattributed—explained that the company’s silence was in part due to a desire to listen to criticism. The company, it said, “expect[s] more of the creators who build their community on @YouTube, as we’re sure you do, too,” adding that the original video “violated our community guidelines, we acted accordingly, and we are looking at further consequences … We know that the actions of one creator can affect the entire community, so we’ll have more to share soon on steps we’re taking to ensure a video like this is never circulated again.”

What steps the platform will take remains unclear, as does what YouTube meant when it claimed to have “acted accordingly”: it was Paul himself who removed the offending video, and who stepped away from social media when things got too hot. (Paul’s channel is still accessible, so it’s not as if YouTube closed it down in response.)

Whatever the outcome, it’s becoming clear that YouTube will face increasing pressure this year to take more of a role in monitoring, and perhaps censoring, the original content posted on its platform. At the same time that the company was addressing the week-old Logan Paul story, social media was abuzz over resurfaced comments about pedophilia from another YouTube personality, Shane Dawson, a story that quickly broke through to mainstream media. One reason that story had legs, of course, is that YouTube has recently had to deal with problematic content targeted at kids.

That YouTube will likely be forced to curate content more aggressively shouldn’t come as a surprise to the company; social networks like Facebook and Twitter have been criticized for some time over their reluctance to do the same, and over the potential for manipulation and abuse of, and through, their systems. YouTube, which similarly prides itself on being a gateway to user-created content, faces the same problem, albeit in a slightly different form.

After all, YouTube’s top creators can monetize their content in a way that Twitter and Facebook stars can only dream of, which complicates matters. So does the fact that the millions of viewers who follow embattled YouTube channels can turn, and indeed have turned, on those critical of their favorites, mobilizing into online armies of trolls in defense, they say, of creative freedom. (This doesn’t always happen, of course; last year, audiences turned away in large numbers from Jon “JonTron” Jafari after he made racist comments.)

What, then, can YouTube do? The company announced new content guidelines just last month, and seems set to do so again in the wake of the Logan Paul video. But guidelines alone are clearly not enough: the Paul video appears to have been reviewed and approved before the controversy broke into the mainstream media. To avoid future scandal and upset, YouTube may be forced to do the one thing it has tried to avoid since its inception, and start treating its original content with the responsibility and curation that traditional television networks have exercised from day one.