
Baby Thor

  • entries
    104
  • comments
    791
  • views
    17,154

Thorsday!!


Imrlgoddess

883 views

So. He's adorable. Aaaannnddd he's a total slut. :my_biggrin:

He really enjoyed Little Hooman's origami paper. She'd made these cute boxes & he felt the need to relieve her of one of them. She wasn't happy about it, but it was hilarious to watch the struggle! 

[Attached photos: 20161005_185410.jpg, 20161004_211139.jpg, 20161004_211109.jpg, 20161003_203342.jpg]

  • Upvote 12

3 Comments


Recommended Comments

catlady

Posted

And he's sleeping so nicely after stealing the origami box. That must have been an exhausting task!

  • Upvote 3
Imrlgoddess

Posted

The sleepy picture was taken a day or two before; I didn't snag a shot of him passed out on her legs...he played and played and then passed out on her :my_biggrin:

  • Upvote 2
feministxtian

Posted

He's gotten so BIG!!! He's not a little handful anymore!!!!! He's still about the CUTEST thing though (second only to my 2). 

Hugs and skritches from us. 

  • Upvote 4

  • Posts

    • Antimony

      Posted

      52 minutes ago, tanba said:

      If I were implementing this, I would blur before putting anything in the database. Now I'm also very doubtful that it can't be unblurred, at least enough to reconstruct recognizable things, but blurring is also likely to save space, so even from a database and systems design perspective I would be surprised if they saved the original images anywhere permanently, not least because of the legal liability. 

      There was one case with a pedophile (given the name "Mr Swirl" in the news) who had posted a manipulated image of his own face online and it took like...one guy with Photoshop who was like, "Uh, yea, that can be undone."

      Some reports suggest it took a little too long to find that one guy, and perhaps the confidence of said pedophile made investigators think the manipulation algorithm was more than it was, but it was really...literally just the spiral twirl filter.

      I am mostly doubtful CE owns or bought any sort of interesting blurring algorithm. If we can resolve past the diffraction limit of light with STORM...well. It just struck me as such an outlandish claim. 
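
      As an aside on why that un-swirling was so easy: a twirl is a deterministic geometric transform, so running it again with the opposite strength puts every pixel back where it started. A minimal sketch in Python, using scikit-image's swirl as a stand-in for Photoshop's Twirl filter (the strength and radius values are arbitrary illustration choices, not anything from the actual case):

      import numpy as np
      from skimage import data
      from skimage.transform import swirl
      from skimage.util import img_as_float

      original = img_as_float(data.camera())  # any grayscale test image

      # The swirl rotates each pixel around the centre by an angle that
      # depends only on its distance from the centre.
      obscured = swirl(original, strength=10, radius=200)

      # Applying the same transform with the opposite strength reverses the
      # rotation at every radius, recovering the image up to interpolation error.
      recovered = swirl(obscured, strength=-10, radius=200)

      print("mean abs error after un-swirling:", np.abs(recovered - original).mean())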

    • JermajestyDuggar

      Posted

    • HeartsAFundie

      Posted (edited)

      I saw Michelle standing next to Jill in one of the pre-balloon release photos.  Also spotted a front view of Israel standing next to Derick. 

      Izzy, Sam and Freddy all wore pink polos and tan pants, presumably for Isla which I thought was a very nice touch. 

      Edited by HeartsAFundie
    • viii

      Posted

      9 hours ago, dawn9476 said:

      People on Reddit think Joy wasn't there, which is understandable since she went through something similar. Going to something like that would probably just bring all that pain back up.

      I don't think I'd find it understandable if it was me. I'd side-eye my sister if she didn't attend my child's funeral, especially knowing how deep the grief is. I would hope that my sister would be there for me, since she would be able to relate the most. 

      I understand how some trauma can be triggering, but I also think there comes a point where we need to be there for loved ones. If Joy had just lost her daughter within the last year, that might have been a different story. 

      It's a fine line between protecting yourself and supporting loved ones. 

    • tanba

      Posted

      On 4/23/2024 at 4:41 PM, Antimony said:

      A little aside, but one thing I've wondered about since Josh's case is, well, the muck behind Covenant Eyes.

      CE takes screenshots of any activity that it deems "suspicious"* and then blurs them (in a way they claim cannot be unblurred, but I'm doubtful). In any case, it appears they do this to protect the "monitoring allies" from seeing all the porn their husbands are watching or whatever, but. 

      But. This means that CE has to put that screenshot in its database, blur it (and then encrypt it, they claim), and store it somewhere for the accountability ally's report. Which is...okay. But I've always wondered, does this mean there is actually a database that includes CSAM sitting on the CE servers? 

      They seem to allude to this being a possibility in their End User Agreement.

      I don't think we'll ever see this kind of thing in court (in any capacity), but it has struck me as strange since Josh's case that this private company is distributing (blurred) screenshots of unknown material and is also designed specifically to store all these images in its own databases, and how do you even...legally manage such a database and...? And how do they know what content is illegal? Are they monitoring it just to terminate it and...? Not report? Do they have any (legal) duty to report? I don't think so. Do they have any ability to report this? (They will hand over your info when requested by a court, but that's standard.)

      It's such a strange app to me, and it's always seemed like it could open a big can of worms when it comes to cases like Josh's. 

      *Tumblr, a few years ago, used a similar early AI to detect "adult" content but ended up banning a lot of pictures of sand dunes, because to a robot the curves of a sand dune look roughly the same as those of a pale-to-tan naked lady...it's a robot and it doesn't know any better. 

      If I were implementing this, I would blur before putting anything in the database. Now I'm also very doubtful that it can't be unblurred, at least enough to reconstruct recognizable things, but blurring is also likely to save space, so even from a database and systems design perspective I would be surprised if they saved the original images anywhere permanently, not least because of the legal liability. 

      • Upvote 1
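
      On the recurring "cannot be unblurred" claim above: if the blur kernel is known, or can be guessed, classical deconvolution gets a recognizable image back out of a blurred one. A minimal sketch with scikit-image's Wiener filter, assuming a simple 5x5 box blur plus a little noise (purely illustrative; it says nothing about what CE actually does to its screenshots):

      import numpy as np
      from scipy.signal import convolve2d
      from skimage import color, data, restoration

      image = color.rgb2gray(data.astronaut())

      # Blur with a known 5x5 box kernel and add a little noise, standing in
      # for a screenshot that has supposedly been irreversibly obscured.
      psf = np.ones((5, 5)) / 25
      blurred = convolve2d(image, psf, mode='same')
      rng = np.random.default_rng(0)
      blurred += 0.05 * blurred.std() * rng.standard_normal(blurred.shape)

      # Wiener deconvolution with the same kernel undoes most of the blur.
      deblurred = restoration.wiener(blurred, psf, balance=0.1)

      print("blurred error vs. original:  ", np.abs(blurred - image).mean())
      print("deblurred error vs. original:", np.abs(deblurred - image).mean())

      A real product would presumably blur far more aggressively than a 5x5 box, but the general point stands: "blurred" is not the same as "destroyed" unless the blur is heavy enough that nothing recognizable survives.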

