
Discussions for Scenes for Version 1.2.X Fullscreen Mode here


Wyldanimal
MODERATOR
Joined in Mar 2008

4175 post(s)
1 day ago (edited)
Can you tell us more about your 2D to 3D conversion app?

A little background first:
For many years I've wanted a software version of what the LG 3D TVs have in their firmware:
real-time 2D to 3D conversion.

Have a look here:
https://iwantaholodeck.com/stream-to-3d-configuration/
and here:
https://iwantaholodeck.gumroad.com/l/wnclr

I thought I had found the golden ticket...
I bought in...
I was totally disappointed. It's not 3D, it's just a 2D screen floating.

So I decided to dive in and see what I could come up with.
After a few hours I had a pretty good working live 2D to 3D app.

I've tried to package it into a single downloadable executable file,
but so far everything I have tried breaks it.

I have it running on my system using my GPU's CUDA cores for all the heavy processing,
but my GPU is aging (an RTX 3060) and I'm not running the latest drivers.

Every time I bundle it up, the resulting executable defaults to CPU only,
and that runs at about 2 seconds per frame, or 30 frames a minute.

With my GPU, I get a pretty steady 24 fps live 2D-to-3D conversion.
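For anyone debugging the same bundling problem, here's a quick way to confirm whether a packaged build can still see the GPU. The post doesn't say which CUDA library is used, so PyTorch here is only an assumption, not the author's actual stack.

```python
# Hedged check: assumes PyTorch is the CUDA library in use (the post doesn't say).
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Frames will be processed on: {device}")  # a bundled .exe that falls back to CPU prints "cpu"
```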

I do a find-edges pass with high contrast,
then normalize the edges to white.

I do a gradient flood fill on the inside of each edge shape.
Depending on where on the screen the shape is found, the gradient is darkened to account for depth.
Then I make an inverted grayscale of the original image, and do a weighted blend
of the gradient and the grayscale.

The found edges get inverted and feathered as an edge fall-off to give the shapes edge definition.

Finally, this pseudo depth map is applied as a pixel shift, where white is high and black is low.
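For readers who want to follow along, here is a rough OpenCV/NumPy sketch of the kind of pipeline described above. It is not the author's code: the per-shape gradient flood fill is approximated with a simple bottom-to-top ramp, and all thresholds, blur sizes, and blend weights are guesses.

```python
import cv2
import numpy as np

def pseudo_depth_map(frame_bgr):
    """Rough sketch of the steps described above; thresholds, blur sizes and
    blend weights are guesses, and the per-shape gradient flood fill is
    approximated with a bottom-to-top ramp (bottom of screen = nearer)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Find edges with high contrast; edge pixels come back normalized to white (255).
    edges = cv2.Canny(gray, 100, 200)

    # Stand-in for the gradient flood fill: darker toward the top of the
    # screen (farther), brighter toward the bottom (nearer).
    h, w = gray.shape
    ramp = np.tile(np.linspace(0, 255, h, dtype=np.float32)[:, None], (1, w))

    # Inverted grayscale of the original image, weighted-blended with the ramp.
    inv_gray = 255.0 - gray.astype(np.float32)
    depth = 0.6 * ramp + 0.4 * inv_gray

    # Inverted, feathered edges as an edge fall-off for shape definition.
    soft_edges = cv2.GaussianBlur(255 - edges, (9, 9), 0).astype(np.float32) / 255.0
    depth *= soft_edges

    # Pseudo depth map: white = high (near), black = low (far).
    return np.clip(depth, 0, 255).astype(np.uint8)
```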

I frame-buffer up to 4 frames
and pop them off the stack to use as the alternate-eye image,
so that each eye gets a slightly different shifted image.
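And a minimal sketch of the pixel shift plus the small frame buffer for the alternate eye. The maximum shift value and the exact buffering policy are assumptions; only the 4-frame buffer size comes from the post.

```python
from collections import deque
import numpy as np

MAX_SHIFT = 12                  # max horizontal shift in pixels (value is a guess)
eye_buffer = deque(maxlen=4)    # "frame buffer up to 4 frames"

def shift_by_depth(frame, depth, max_shift=MAX_SHIFT):
    """Shift each pixel horizontally by an amount scaled by its depth value
    (white = large shift, black = none)."""
    h, w = depth.shape
    shift = (depth.astype(np.float32) / 255.0 * max_shift).astype(np.int32)
    src_x = np.clip(np.arange(w)[None, :] - shift, 0, w - 1)
    src_y = np.arange(h)[:, None]
    return frame[src_y, src_x]

def alternate_eye(frame, depth):
    """Push the shifted frame onto the buffer and pop an older one off, so the
    alternate eye sees a slightly different (and slightly delayed) image."""
    eye_buffer.append(shift_by_depth(frame, depth))
    return eye_buffer.popleft() if len(eye_buffer) == eye_buffer.maxlen else eye_buffer[-1]
```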

It's all just a lot of weighted math,
combined with a bit of trial and error.

I've tried to record video while doing the on-the-fly conversion, but my system just doesn't have the muscle to do both at the same time.

PS
Using what I learned, I did my best to incorporate some of it in my latest scene.
There is an extra folder with two additional scenes;
these apply my 2D-to-3D shader.
The results are just OK: not great, not terrible.

The shader provides some debug views to look at the edges and the gradient.
The problem with a shader is that it's a one-pass process:
I can't find edges and store them for later reuse with a flood fill.
Edge detection happens along a single horizontal scan line,
and then I fill in between those edge pixels with a gradient line of pixels.
Repeat for the next row,
then stack all the rows together and you get a single pass.
But the gradient isn't the same as if you had the shape outline and did a flood fill to get a smooth, coherent gradient.

What you get is a jagged gradient, which can be more or less blended in with a blur.
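To illustrate the difference, here is a CPU-side sketch of that single-pass, row-by-row idea. This is not the actual .fsh shader, and the gray ramp values are arbitrary.

```python
import numpy as np

def scanline_gradient(edge_mask):
    """For each row, find the edge pixels detected on that scan line and fill
    the spans between consecutive edge pixels with a linear gray ramp.
    Stacking the rows gives the jagged gradient described above, which a blur
    can then partially smooth. edge_mask is a 2-D boolean array."""
    h, w = edge_mask.shape
    out = np.zeros((h, w), dtype=np.float32)
    for y in range(h):
        cols = np.flatnonzero(edge_mask[y])
        for a, b in zip(cols[:-1], cols[1:]):
            out[y, a:b + 1] = np.linspace(64, 192, b - a + 1)  # arbitrary ramp values
    return out
```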
Wyldanimal
MODERATOR
Joined in Mar 2008

4175 post(s)
23 hours ago
Ok, I attempted to record a video of the 2D-to-3D on-the-fly conversion:
https://wyldanimal.com/2Dto3D/2dto3D-demo-002.mp4

I also made 54 different test videos.

You have to download them to watch them...
They're named is-001_3D.mp4 to is-054_3D.mp4; just change the 3-digit number in the link:
https://wyldanimal.com/2Dto3D/is-001_3D.mp4
PS: I'm hoping that OWL3D will soon have a live version.
Their AI keeps getting faster...


Calgon
Joined in May 2022

425 post(s)
17 hours ago
@WA

That's very interesting - thanks for sharing all of that. I tried to get Python to work with CUDA for a different project, but I couldn't get it to compile; I tried many times and gave up.

If I understand what you've done correctly, you've taken a different route to me to get a 3D effect.

A few questions....

Are you taking the whole feed from a non-SBS screen and converting it to SBS with the depth processing? I.e. you're not starting from an SBS view of iStripper?

Does the .py program analyse the whole frame and say... looking at the gradient and edges, I will move this pixel n steps? If it does, and you get gaps, how do you fill them?

I'm assuming this has no concept of depth from shape, so for example the side walls of a room aren't given any extra depth other than that derived from their gradient and edge effects?
Wyldanimal
MODERATOR
Joined in Mar 2008

4175 post(s)
12 hours ago
"Are you taking the whole feed from a non-SBS screen and converting it to SBS with the depth processing? I.e. you're not starting from an SBS view of iStripper?"

Correct, I'm taking the entire 2D view, 2D background and 2D model, exactly what is seen in the OBS view on screen 2.
I capture a frame from screen 2, apply my 3D conversion, pixel-shift the right eye,
and display the capture as the left eye and the pixel-shifted image as the right eye.
I also do frame buffering, so that the left eye is one frame or multiple frames ahead of the right eye.
A zero-frame buffer means the 3D effect is purely from the applied gradient.

A one-frame buffer means that the right eye has pixels from the previous frame included.
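Putting the pieces together, here is a hedged sketch of how one side-by-side frame might be composed. pseudo_depth_map() and shift_by_depth() refer to the earlier sketches, not the author's code, and the buffer policy is simplified to a single previous frame.

```python
import numpy as np

prev_shifted = None   # previous shifted frame; None means nothing buffered yet

def to_sbs(frame, use_frame_buffer=True):
    """Left eye = the captured frame; right eye = the pixel-shifted image,
    optionally taken from the previous frame (one-frame buffer)."""
    global prev_shifted
    depth = pseudo_depth_map(frame)
    shifted = shift_by_depth(frame, depth)
    if use_frame_buffer and prev_shifted is not None:
        right = prev_shifted          # right eye lags the left by one frame
    else:
        right = shifted               # zero buffer: 3D effect purely from the gradient
    prev_shifted = shifted
    return np.hstack([frame, right])  # side-by-side output: left | right
```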

"Doe the .py program analyse the whole frame and say.... looking at the gradient and edges, I will move this pixel n steps ? If it does and you get gaps, how do you fill them ?"

Yes, the entire frame is analyzed.
A blur is applied to the overall gradient, and the softened edges are applied to give edge definition.
It's N steps times the shift amount: the gray value of the gradient controls N, and the shift amount is the multiplier.
The shift can be positive or negative, and the gradient can also be inverted.
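As a tiny worked example of that rule (the exact mapping from gray value to N is my assumption, not stated in the post):

```python
def pixel_shift(gray, shift_amount):
    """Shift for one pixel: N (derived from the gradient's gray value, 0-255)
    times the shift amount. A negative shift amount moves pixels the other
    way; inverting the gradient flips which areas move."""
    n = gray / 255.0          # assumed mapping of gray value to N
    return round(n * shift_amount)

pixel_shift(128, 8)    # -> 4  (mid-gray, positive shift)
pixel_shift(128, -8)   # -> -4 (same gray, negative shift)
```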

"I'm assuming this has no concept of depth from shape, so for example the side walls of a room aren't given any extra depth other than that derived from their gradient and edge effects?"

I start at the bottom and scan upwards; each edge is assigned a gray gradient based on how far from the bottom it is.

Gradients have a color range, say 1 to 10, and then the next-level gradient has a gap, say of 30,
so the very next gray level would start at 40 (10 + gap of 30 = 40) and run to 50.
So these levels would be:
Gradient 1 is 1 to 10
Gap of 30
Gradient 2 is 40 to 50
Gap of 30
Gradient 3 is 80 to 90
and so on.
After all the gradients are built, an overall blur is applied, and then the soft edge detections are blended back in.
This becomes the pseudo depth map.
Depending on the mode, a grayscale of the captured frame is also blended in, or not.
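A small sketch of that band scheme, using the illustrative numbers from the post (band width 10, gap 30):

```python
def gradient_bands(num_levels, band_width=10, gap=30, start=0):
    """Each depth level gets a narrow gray range separated from the next by a
    fixed gap, mirroring the 1-10 / 40-50 / 80-90 example above."""
    bands = []
    for _ in range(num_levels):
        bands.append((start, start + band_width))
        start += band_width + gap
    return bands

print(gradient_bands(3))   # [(0, 10), (40, 50), (80, 90)]
```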

Almost all of these can be tweaked on the fly.

In addition to the live 2D-to-3D,
I included a file-conversion mode,
where the video frames are read from a video file, the conversion applied, and then displayed and/or saved to an output file.





Calgon
Joined in May 2022

425 post(s)
5 hours ago
@WA

Thanks for sharing all of that. There's lots going on here, but I think it's only you and me who seem to be interested in this mind-blowing tech, lol.

Here's a new demo clip of where I've got to with my own 3D shader. The last one of these I posted was downloaded 63 times without a single comment from anyone - so I kind of gave up on the forum at that point and decided that I would remain the only person in the world viewing iStripper in this amazing way.

Just to be clear, this is a real-time shader, no post-processing.

Anyone interested (including @Totem) pls respond.

https://bit.ly/4pjP9ki

Wyldanimal
MODERATOR
Joined in Mar 2008

4175 post(s)
3 hours ago
I went back to look, and I saw that I did miss one of your scenes:
(8K)(SBS) Gallery

Is your current demo video made using that scene, or do you have an updated scene you are using?

I had some trouble running it on my system, but I saw enough to like it...

I couldn't play the scene plus view it with Virtual Desktop at the same time,
so I just screen-captured it running and then played the video in the headset.

Calgon
Joined in May 2022

425 post(s)
2 hours ago
It's been a while since I produced this demo, but I haven't uploaded the .fsh, so if you are looking at one of my .scn files then that's OLD OLD and just has a similar name.

What's in the demo .mpg is the 3D enhancement to the models, not the backgrounds. That's the key thing here... models with texture. Hooters that hoot, clothes that have wrinkles, and of course wrinkly bits with wrinkles.

I use Virtual Desktop's video player to view this with no issues.



