TLDR: I uploaded a video to YouTube, but YouTube's thumbnail-picking AI script kept avoiding the thumbnail I embedded in the video no matter what tricks and changes I tried, prompting me to delete the video instead of sharing my phone number for the privilege of uploading a custom thumbnail.

This was my first time trying to upload anything to YouTube. I knew my effort would be wasted, but I went with it nonetheless. I had spent two days recording a tutorial and was hoping for a small view count. Now, I must say that I have dealt with Google services in the past, and my disappointment with them was immeasurable, so I kinda had a taste of what was coming.

I uploaded the video and worked on the thumbnail, only to find out that I can't add a custom thumbnail without adding my phone number to the account. I was hesitant, since I care about my privacy, and I also wasn't sure it was worth it, considering I didn't have high hopes for the channel. Instead, I was given a choice of three auto-generated thumbnails: one screenshot from the beginning of the video, one from the middle, and one from the end.

So I thought, why not take advantage of that: re-encode the video with the thumbnail at the start and have YouTube's script pick that frame. Guess what? I re-rendered the video six times, changing the placement and display length of the thumbnail at the beginning of the video, trying to animate the thumbnail's elements, and all sorts of other tricks. Nothing worked. YouTube kept changing the timestamp it picked the thumbnail from, moving it further toward the middle of the video, all of this just so you can never post a custom thumbnail without sharing your phone number.
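For anyone who wants to try something similar: one way to do this kind of re-encode is with ffmpeg's overlay filter, pinning the thumbnail image over the first couple of seconds of footage. The sketch below (a small Python wrapper around ffmpeg) is illustrative only and not my exact pipeline; the file names, the two-second duration, and the overlay position are placeholders.

```python
import subprocess

# Placeholder file names; the thumbnail is assumed to match the video resolution.
VIDEO = "tutorial.mp4"
THUMB = "thumb.png"
OUTPUT = "tutorial_with_thumb.mp4"
SECONDS = 2  # how long the thumbnail stays pinned over the opening footage

# Overlay the thumbnail at the top-left corner for the first SECONDS seconds,
# re-encoding the video stream and copying the audio unchanged.
cmd = [
    "ffmpeg", "-y",
    "-i", VIDEO,
    "-i", THUMB,
    "-filter_complex",
    f"[0:v][1:v]overlay=0:0:enable='lte(t,{SECONDS})'",
    "-c:a", "copy",
    OUTPUT,
]
subprocess.run(cmd, check=True)
```

Changing `SECONDS` or the `overlay` coordinates is how you vary the display length and placement between renders.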

I knew they were vicious, but not to the point of allocating resources and training AI against these sorts of shenanigans. I miss the days when we had workarounds for everything.

  • @ExtremeDullard
    1 month ago

    I hate Google as much as the next guy - well, probably a lot more than the next guy actually - but here I’m siding with them for a change.

    They require payment for a feature (your phone number: it is monetizable private data to them and that’s your payment) and you tried to get the feature without paying. And you failed.

    The story here is that AI is frighteningly accurate when detecting embedded screenshots, not that Google is “vicious”: they’re not vicious in this case, they’re simply scary successful at detecting your attempts to game their system. Probably because everybody and their dog tries the same trick all the time, I would assume.

    Generally speaking, I agree with your assessment of Big Data and Google. But not in this particular instance.