
How to plot a 3D point on a VR (virtual reality) image

I'm trying to understand the formula to go from a 3D point (x, y, z) in the world to a virtual representation on a 2D image.

I'm using this page as an example:

http://dynamic.pulselive.com/dynamic/client/espn/tennis/presets/


How do I get the (a, b) position from the real (x, y, z) location of the ball?

If you inspect the page, there is a lot of interesting information:

  • the page draws its graphics with the Raphaël library: http://raphaeljs.com/

  • the ball positions are given as (x, y, 0) court coordinates, for example for the "VR camera [1-4]" views:

     this.graphs['vr-2'].setData({
         "names": ["PLAYER A", "PLAYER B"],
         "sets": [0, 0],
         "data": {
             "1.1.1": {
                 "serverIndex": 0,
                 "serveType": "f",
                 "servePlacement": {"x": 0, "y": 0},
                 "returnStrike": {"z": 0.9144, "x": 0, "y": 0},
                 "placements": [[
                     {"x": 0, "y": 0},
                     {"x": -6.4008, "y": 0},
                     {"x": -11.8872, "y": 5.4864},
                     {"x": -11.8872, "y": -5.4864},
                     {"x": 11.8872, "y": 5.4864},
                     {"x": 11.8872, "y": -5.4864}
                 ], []],
                 "strikes": [[], []]
             }
         }
     });
  • the middle of the net seems to be (0, 0, 0)

  • In the example above, the four balls are:

     -11.8872,  5.4864, 0   (the ball on the tennis double line)
      0,        0,      0   (the ball in the middle of the net)
      6.4008,   0,      0   (the ball at the T serve line)
     -11.8872, -5.4864, 0   (the ball on the other tennis double line)
  • the file projection.js has the following content:

     onProjections([{
         "sp":   {"height":290,"width":288,"y":1.57271656153463,"p":-2.37960648536682,"tx":44.8464085506826,"ty":0.072299872443644,"tz":43.6183372879028,"r":0.0,"ar":1.0,"fl":1876.41418457031,"cx":144,"cy":145},
         "rsp":  {"height":290,"width":288,"y":1.56934654388598,"p":-2.26160676579457,"tx":32.6921108381257,"ty":0.227987701481753,"tz":19.2898811340332,"r":0.0,"ar":1.0,"fl":557.320129394531,"cx":144,"cy":145},
         "p":    {"height":290,"width":288,"y":1.57234654402847,"p":-2.01060693582986,"tx":40.2668354447684,"ty":0.0886826017731878,"tz":15.4698811340333,"r":0.0,"ar":1.0,"fl":1027.31091308594,"cx":144,"cy":145},
         "vr-4": {"height":290,"width":578,"y":-3.14002369728572,"p":-3.14159274101257,"tx":0.0720691962205988,"ty":-0.237841598739008,"tz":144.869881134034,"r":0.0,"ar":1.0,"fl":1735.24694824219,"cx":289,"cy":145},
         "vr-3": {"height":290,"width":578,"y":3.13971663596315,"p":-1.80660665035248,"tx":-0.0380399327222413,"ty":21.4384628442103,"tz":8.04833728790282,"r":0.0,"ar":1.0,"fl":333.111877441406,"cx":289,"cy":145},
         "vr-2": {"height":290,"width":578,"y":2.14571658875068,"p":-1.86560642719269,"tx":27.0765197103578,"ty":16.2179742216193,"tz":10.4983372879028,"r":0.0,"ar":1.0,"fl":445.122161865234,"cx":289,"cy":145},
         "vr-1": {"height":290,"width":578,"y":1.56571656120214,"p":-1.88360643386841,"tx":37.1591340096487,"ty":-0.0542132391350438,"tz":10.9683372879027,"r":0.0,"ar":1.0,"fl":787.308471679688,"cx":289,"cy":145},
         "rhp":  {"height":290,"width":288,"y":1.57134654398098,"p":-2.25760674476624,"tx":31.4921270358275,"ty":0.153327426083581,"tz":18.0298811340332,"r":0.0,"ar":1.0,"fl":557.320129394531,"cx":144,"cy":145}
     }]);

I think my question boils down to: how do I use the data in projection.js to calculate the (a, b) position from the (x, y, z) location? Note that I'm after the formula itself, not the JavaScript library, since I will have to implement this with different tools.

EDIT #1:

Each view's data seems to include: height, width, y (yaw), p (pan), tx, ty, tz (camera position), r (roll), ar (aspect ratio?, always 1), fl (focal length?), cx, cy (centre of the image?).
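
If those guesses hold, the fields would map onto a standard pinhole camera model: build a rotation from y/p/r, express the world point relative to the camera position (tx, ty, tz), then scale by fl and shift by (cx, cy). Here is a minimal sketch of that split in Python; the function names and the exact meaning of each field are my assumptions, not anything confirmed by projection.js:

    import numpy as np

    def intrinsics(fl, ar, cx, cy):
        # Pinhole intrinsic matrix K, assuming fl is a focal length in pixels,
        # ar a pixel aspect ratio and (cx, cy) the principal point.
        return np.array([[fl,      0.0, cx],
                         [0.0, fl * ar, cy],
                         [0.0,     0.0, 1.0]])

    def project(K, R, cam_pos, point_world):
        # World -> camera: rotate the point expressed relative to the camera
        # position (whether tx/ty/tz is a position or a translation vector is
        # itself a guess), then camera -> pixels via K and a perspective divide.
        p_cam = R @ (np.asarray(point_world, float) - np.asarray(cam_pos, float))
        u, v, w = K @ p_cam
        return u / w, v / w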

This page gives a formula but it doesn't include the yaw:

http://freespace.virgin.net/hugo.elias/routines/3d_to_2d.htm

    procedure 3Dto2D (x, y, z, pan, centre, position)

    x = x + position.x
    y = y + position.y
    z = z + position.z

    new.x = x*cos(pan.x) - z*sin(pan.x)
    new.z = x*sin(pan.x) + z*cos(pan.x)
    new.y = y*cos(pan.y) - new.z*sin(pan.y)
    z = y*sin(pan.y) + new.z*cos(pan.y)
    x = new.x*cos(pan.z) - new.y*sin(pan.z)
    y = new.x*sin(pan.z) + new.y*cos(pan.z)

    if z > 0 then
        screen.x = x / z * zoom + centre.x
        screen.y = y / z * zoom + centre.y
    end if
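
For reference, here is that routine transcribed into Python (a sketch only: pan is the camera rotation as (rx, ry, rz) in radians, zoom plays the role of a focal length in pixels, and the parameter layout is my own):

    from math import cos, sin

    def project_3d_to_2d(x, y, z, pan, centre, position, zoom):
        # Translate the world point relative to the camera position.
        x += position[0]
        y += position[1]
        z += position[2]

        # Rotate about the three axes in turn (same steps as the pseudocode).
        nx = x * cos(pan[0]) - z * sin(pan[0])
        nz = x * sin(pan[0]) + z * cos(pan[0])
        ny = y * cos(pan[1]) - nz * sin(pan[1])
        z  = y * sin(pan[1]) + nz * cos(pan[1])
        x  = nx * cos(pan[2]) - ny * sin(pan[2])
        y  = nx * sin(pan[2]) + ny * cos(pan[2])

        # Perspective divide: only points in front of the camera project.
        if z > 0:
            return (x / z * zoom + centre[0],
                    y / z * zoom + centre[1])
        return None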

If I consider the ball on the tennis double line for the VR Camera 2, I have the following information:

REAL WORLD POSITION (from html source)
X:  -11.8872
Y:  -5.4864
Z:  0.0

VR CAMERA 2 INFORMATION (from javascript projection.js)
height: 290
width:  578
y:  2.14571658875068
p:  -1.86560642719269
tx: 27.0765197103578
ty: 16.2179742216193
tz: 10.4983372879028
r:  0.0
ar: 1.0
fl: 445.122161865234
cx: 289
cy: 145

2D LOCATION (from inspecting the live page in the browser)
A:  121.9715646498276
B:  193.3313740587937

What is the function (A, B) = f(X, Y, Z, height, width, y, p, tx, ty, tz, r, ar, fl, cx, cy)?
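
I can't confirm the site's conventions, but assuming y, p and r are Euler angles in radians, tx/ty/tz is the camera position in court coordinates, and fl/cx/cy are pixel-space intrinsics, f would look roughly like the sketch below. The rotation order and signs are guesses and would have to be tuned until the known ball positions reproduce the observed (A, B) values.

    import numpy as np

    def rotation(yaw, pitch, roll):
        # One possible Euler convention: yaw about the vertical axis, then
        # pitch, then roll. The order and signs used by projection.js are
        # unknown, so treat this as a starting point to experiment with.
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Rz = np.array([[cy, -sy, 0.0], [sy,  cy, 0.0], [0.0, 0.0, 1.0]])  # yaw
        Rx = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp,  cp]])  # pitch
        Ry = np.array([[cr, 0.0,  sr], [0.0, 1.0, 0.0], [-sr, 0.0, cr]])  # roll
        return Rz @ Rx @ Ry

    def project_point(view, point):
        # Rotate the point into the camera frame, then apply fl, ar, cx, cy.
        R = rotation(view["y"], view["p"], view["r"])
        t = np.array([view["tx"], view["ty"], view["tz"]])
        xc, yc, zc = R @ (np.asarray(point, float) - t)
        return (view["fl"] * xc / zc + view["cx"],
                view["fl"] * view["ar"] * yc / zc + view["cy"])

    # VR camera 2 parameters from projection.js and the doubles-line ball:
    vr2 = {"y": 2.14571658875068, "p": -1.86560642719269, "r": 0.0,
           "tx": 27.0765197103578, "ty": 16.2179742216193, "tz": 10.4983372879028,
           "ar": 1.0, "fl": 445.122161865234, "cx": 289, "cy": 145}

    a, b = project_point(vr2, (-11.8872, -5.4864, 0.0))
    # Compare (a, b) against the observed (121.97..., 193.33...) and adjust the
    # rotation order / signs until the known points land where they should.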

The yaw rotation is calculated as:

    x_new = x * cos(yaw) - y * sin(yaw)
    y_new = x * sin(yaw) + y * cos(yaw)

http://thundaxsoftware.blogspot.fi/2012/01/projecting-3d-points-to-2d-screen.html?m=1
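
So one way to add the missing yaw to the routine quoted earlier would be to rotate the court-plane (x, y) about the vertical axis first; a tiny helper with that step (sign convention unverified):

    from math import cos, sin

    def apply_yaw(x, y, yaw):
        # Rotation about the vertical axis, as in the formula above; applied to
        # the court (x, y) before the pitch/roll steps of the earlier routine.
        return (x * cos(yaw) - y * sin(yaw),
                x * sin(yaw) + y * cos(yaw))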
