
Randumb Thoughts

  • entries: 132
  • comments: 821
  • views: 14,629

Free At Last, Part 2


crazydaffodil

1,213 views

Some days, you just don't wanna...  Sometimes, it involves wearing underwear.  There's just that one day where we say we're going to fly free and be as we are without restrictions.  Ummmm, sometimes ya gotta rethink that idea before you decide to do something else!

[Images: commando.jpg, going commando.png]

 

Just sayin...

  • Upvote 1

1 Comment


Recommended Comments

WhatWouldJohnCrichtonDo?

Posted

I should remember not to read your blog before breakfast. Eeewww!!!

  • Upvote 2

  • Posts

    • On 4/23/2024 at 4:41 PM, Antimony said:

      A little aside, but one thing I've wondered about since Josh's case is, well, the muck behind Covenant Eyes.

      CE takes screenshots of any activity that it deems "suspicious"* and then blurs them (in a way they claim cannot be unblurred, though I'm doubtful). It appears they do this to protect the "monitoring allies" from seeing all the porn their husbands are watching, or whatever.

      But this means that CE has to put that screenshot in its database, blur it (and then encrypt it, they claim), and store it somewhere for the accountability ally's report. Which is...okay. But I've always wondered: does this mean there is actually a database that includes CSAM sitting on the CE servers?

      They seem to allude to this being a possibility in their End User Agreement.

      I don't think we'll ever see this kind of thing in court (in any capacity), but it has struck me as strange since Josh's case that this private company is distributing (blurred) screenshots of unknown material and is specifically designed to store all these images in its own databases. How do you even legally manage such a database? And how do they know what content is illegal? Are they monitoring it just to terminate accounts and...not report? Do they have any (legal) duty to report? I don't think so. Do they have any ability to report this? (They will hand over your info when requested by a court, but that's standard.)

      It's such a strange app to me, and it's always seemed like it could open a big can of worms when it comes to cases like Josh's.

      *Tumblr, a few years ago, used a similar early AI to detect "adult" content but ended up banning a lot of pictures of sand dunes, because to a robot the curves of a sand dune look roughly the same as those of a pale-to-tan naked lady. That's a robot for you...it doesn't know.

      If I were implementing this, I would blur before putting anything in the database. I'm also very doubtful that it can't be unblurred, at least enough to reconstruct recognizable things. But blurring is also likely to save space, so even from a database and systems design perspective, I would be surprised if they permanently saved the original images anywhere, not least because of the legal liability.
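      A minimal sketch of that blur-before-store idea (Python with Pillow; the function name and the storage call are hypothetical illustrations, not anything Covenant Eyes has published):

      # Hypothetical sketch: blur the capture first, then persist only the blurred bytes.
      import io
      from PIL import Image, ImageFilter

      def blur_and_encode(screenshot_path: str, radius: int = 12, quality: int = 40) -> bytes:
          """Blur a captured screenshot and return compressed JPEG bytes.

          Only this blurred, heavily compressed version would be stored;
          the original capture is discarded, which also saves space.
          """
          img = Image.open(screenshot_path).convert("RGB")
          blurred = img.filter(ImageFilter.GaussianBlur(radius=radius))
          buf = io.BytesIO()
          blurred.save(buf, format="JPEG", quality=quality)
          return buf.getvalue()

      # report_store.save(user_id, blur_and_encode("capture.png"))  # hypothetical storage layer

      A Gaussian blur like this is lossy but not provably irreversible, which fits the skepticism above about the "cannot be unblurred" claim.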

    • Melissa1977

      Posted

      I don't think it's a genuine MCM. I think it's an imitation. A $6000 piece of furniture in perfect shape is not sold for $500. Yes, sometimes this happens, but I suspect Abbie is lying to look like the smartest buyer. She is noot overspending, she is nooot spoiled, she is investing!

    • Xan

      Posted

      8 hours ago, neuroticcat said:

      She talked about how the little children were taking bread from the pantry and putting it in there. So maybe it was clean when it came home but then the food got in it? 

      I think she just lied.  She didn't want to admit that she hadn't even opened the cabinet and cleaned it out, so she blamed it on the boys.  They'd have no reason to put bread in there and, honestly, it didn't look like bread crumbs.  Abbie was so excited over her latest ugly find that it hadn't occurred to her to clean it up first.

      • Upvote 3
    • JermajestyDuggar

      Posted

      I think the little boys just get up to lots of shenanigans because Braggie refuses to supervise them. 

      • Upvote 1
    • JermajestyDuggar

      Posted

      8 hours ago, SassyPants said:

      I think with all of Josh’s issues, the families still in AR, for the most part, have hunkered down and are protecting their own family’s privacy. The less they expose themselves, the less pushback and snark they receive.

      Which of course is a smart move in my opinion. 

      • Upvote 2
      • I Agree 2

