fish_3d package¶
Submodules¶
fish_3d.camera module¶
- class fish_3d.camera.Camera¶
Bases: object
This class holds intrinsic/extrinsic parameters of a camera, with some handy functions
- self.r¶
rotation matrix of the camera, shape (3, 3)
- self.ext¶
extrinsic parameters: R, t assembled as [R|t]
- self.p¶
the projection matrix, int @ ext
- self.o¶
origin of the camera, shape (3, 1)
- calibrate(int_images: list, ext_image: str, grid_size: float, order='x123', corner_number=(6, 6), win_size=(5, 5), show=True)¶
Update the intrinsic and extrinsic camera matrices using OpenCV's chessboard detector. The distortion coefficients are detected as well. The corner number should be in the format (row, column).
- calibrate_ext(ext_image: str, grid_size: float, order='x123', corner_number=(6, 6), win_size=(5, 5), show=True)¶
Update the EXTRINSIC camera matrix using OpenCV's chessboard detector. The distortion coefficients are detected as well. The corner number should be in the format (row, column).
- calibrate_int(int_images: list, grid_size: float, corner_number=(6, 6), win_size=(5, 5), show=True)¶
Update the INTRINSIC camera matrix using OpenCV's chessboard detector. The distortion coefficients are detected as well. The corner number should be in the format (row, column).
- load_json(fname)¶
Recover essential parameters from a json file
- project(position: numpy.array)¶
Project a 3D position onto the image plane
- Parameters
position (np.ndarray) – a collection of 3D points, shape (n, 3)
- project_refractive(positions)¶
Project 3D points located under water. The normal of the water-air interface is assumed to be (0, 0, 1).
- Parameters
positions (numpy.ndarray) – a collection of 3D points, shape (n, 3)
- Returns
the projected 2D locations, shape (n, 2)
- Return type
numpy.ndarray
- read_calibration(mat_file: str)¶
Read a calibration result from TOOLBOX_calib. The calibration result is generated by the following Matlab script:
save(filename, 'fc', 'cc', 'Tc_ext', 'Rc_ext');
- read_int(another)¶
- redistort_points(points: numpy.array)¶
- Parameters
points (np.ndarray) – undistorted image coordinates (u, v) in pixels, shape (2, n)
- Returns
the distorted points
- Return type
np.ndarray
- save_json(fname)¶
Dump essential parameters to a json file
- undistort(point: numpy.array, want_uv=False)¶
undistort a point in an image; the coordinate is (u, v)
- Parameters
point (np.ndarray) – the points to be undistorted
want_uv (bool) –
- Returns
undistorted points, being xy, not uv (camera.K @ xy = uv)
- Return type
np.ndarray
- undistort_image(image: numpy.array)¶
- Parameters
image (np.ndarray) – an image taken by the camera
- Returns
an undistorted 2D image
- Return type
np.ndarray
- undistort_points(points: numpy.array, want_uv=True)¶
Undistort many points in an image; the coordinates are (u, v), NOT (x, y)
- Returns
undistorted version of (x’, y’) or (u’, v’), shape (n, 2)
- Return type
np.ndarray
x' * fx + cx -> u'
- unzip_essential(data)¶
Unpack essential parameters from a dict and load them as attributes
- update()¶
- zip_essential()¶
Pack all essential parameters into a dict which can be dumped as a json file
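Example
A minimal calibration-and-projection sketch. The file names, grid size and corner numbers below are illustrative placeholders, not files shipped with the package:

import numpy as np
from fish_3d.camera import Camera

cam = Camera()
# intrinsic calibration from several chessboard images
cam.calibrate_int(
    ['int_01.png', 'int_02.png', 'int_03.png'],
    grid_size=10.0, corner_number=(6, 6), show=False,
)
# extrinsic calibration from one image of the chessboard defining the world frame
cam.calibrate_ext('ext.png', grid_size=10.0, order='x123', show=False)
cam.save_json('cam_1.json')  # dump the essential parameters

points_3d = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, -50.0]])
uv = cam.project(points_3d)                    # points in air
uv_water = cam.project_refractive(points_3d)   # points under water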
- fish_3d.camera.calib_mult_ext(cam_1, cam_2, cam_3, images_v1, images_v2, images_v3, orders_v1, orders_v2, orders_v3, grid_size, corner_number, win_size=(10, 10), debug=False)¶
Do the extrinsic calibration multiple times and calculate the average relative displacement & angle. Then use the last calibration image to define the world coordinate system.
- Parameters
cam_1 (Camera) – The internal parameters should be correct
cam_2 (Camera) – The internal parameters should be correct
cam_3 (Camera) – The internal parameters should be correct
images_v1 (list) – calibration image filenames for camera 1
images_v2 (list) – calibration image filenames for camera 2
images_v3 (list) – calibration image filenames for camera 3
orders_v1 (list) – the order, ‘x123’ or ‘321x’, for each calibration image for camera 1
orders_v2 (list) – the order, ‘x123’ or ‘321x’, for each calibration image for camera 2
orders_v3 (list) – the order, ‘x123’ or ‘321x’, for each calibration image for camera 3
grid_size (float) – size of a single grid in the chessboard; only square grids are supported
corner_number (tuple) – (horizontal, vertical), specifying the number of corners in each direction; corner_number = grid_number - 1
win_size (tuple) – the search window size used by OpenCV's corner refinement
Equations (@ is dot product):
x1 = r1 @ xw + t1  (world to camera 1, r1 & t1 obtained from Camera.calibrate_ext)
x2 = r2 @ xw + t2  (world to camera 2, r2 & t2 obtained from Camera.calibrate_ext)
x2 = r12 @ x1 + t12  (camera 1 to camera 2)
--> r12 = r2 @ r1'
--> t12 = t2 - r2 @ r1' @ t1
--> t2 = t12 + r12 @ t1
--> r2 = r12 @ r1
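Example
A hedged usage sketch of calib_mult_ext; the json files, image lists, orders and grid parameters are placeholders, and the cameras are assumed to already have correct intrinsic parameters (e.g. loaded from json). The function is assumed to update the cameras in place:

from fish_3d.camera import Camera, calib_mult_ext

cams = [Camera(), Camera(), Camera()]
for cam, fname in zip(cams, ('cam1.json', 'cam2.json', 'cam3.json')):
    cam.load_json(fname)  # intrinsic parameters must already be correct

calib_mult_ext(
    *cams,
    images_v1=['ext_1a.png', 'ext_1b.png'],
    images_v2=['ext_2a.png', 'ext_2b.png'],
    images_v3=['ext_3a.png', 'ext_3b.png'],
    orders_v1=['x123', 'x123'],
    orders_v2=['x123', 'x123'],
    orders_v3=['x123', 'x123'],
    grid_size=10.0,
    corner_number=(6, 6),
)
# afterwards the three cameras share the world frame defined by the
# last calibration image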
- fish_3d.camera.detect_chessboard(image, corner_number, win_size=5)¶
Find the corners on a chessboard
- fish_3d.camera.draw(img, axes)¶
- fish_3d.camera.find_pairs(arr_1, arr_2)¶
- fish_3d.camera.get_fundamental(cam_1: fish_3d.camera.Camera, cam_2: fish_3d.camera.Camera)¶
Get the fundamental matrix between two cameras
In the actual calculation, the axes of camera 2 are rotated so that the extrinsic matrix of camera 1 becomes
[E|0]
(ref: https://sourishghosh.com/2016/fundamental-matrix-from-camera-matrices/)
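Example
A sketch of checking the epipolar constraint with the resulting matrix. The json files and pixel values are placeholders; whether F maps points of view 1 to lines in view 2 or the reverse follows this function's convention, so swap the two points if the residual is unexpectedly large:

import numpy as np
from fish_3d.camera import Camera, get_fundamental

cam_1, cam_2 = Camera(), Camera()
cam_1.load_json('cam1.json')
cam_2.load_json('cam2.json')

F = get_fundamental(cam_1, cam_2)
p1 = np.array([320.0, 240.0, 1.0])  # homogeneous pixel in view 1
p2 = np.array([301.5, 252.0, 1.0])  # its hypothetical match in view 2
residual = p2 @ F @ p1              # ~0 for a true correspondence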
- fish_3d.camera.get_points_from_order(corner_number, order='x123')¶
the expected calibration image is
┌───────┬───────┬───────┐
│ ╲ ╱   │◜◜◜◜◜◜◜│  ╮    │
│  ╳    │◜◜◜◜◜◜◜│  │    │
│ ╱ ╲   │◜◜◜◜◜◜◜│  ┴    │
├───────┼───────┼───────┤
│◜◜◜◜◜◜◜│       │◜◜◜◜◜◜◜│
│◜◜◜◜◜◜◜│       │◜◜◜◜◜◜◜│
│◜◜◜◜◜◜◜│       │◜◜◜◜◜◜◜│
├───────┼───────┼───────┤
│ ╶─╮   │◜◜◜◜◜◜◜│ ──┐   │
│ ╭─╯   │◜◜◜◜◜◜◜│ ╶─┤   │
│ ╰──   │◜◜◜◜◜◜◜│ ──┘   │
└───────┴───────┴───────┘
and the corresponding order is x123 (row, column)
- fish_3d.camera.get_reproject_error(points_2d, points_3d, rvec, tvec, distort, camera_matrix)¶
- fish_3d.camera.plot_cameras(axis, cameras, water_level=0, depth=400)¶
fish_3d.cgreta module¶
GReTA tracking
- fish_3d.cgreta.get_trajs_3d(frames_v3: List[List[numpy.ndarray[numpy.float64[m, 2]]][3]], stereo_links: List[List[Tuple[int, int, int, float]]], project_matrices: List[numpy.ndarray[numpy.float64[3, 4]][3]], camera_origins: List[numpy.ndarray[numpy.float64[3, 1]][3]], c_max: float, search_range: float, re_max: float) List[Tuple[numpy.ndarray[numpy.float64[m, 3]], float]] ¶
- fish_3d.cgreta.get_trajs_3d_t1t2(frames_v3: List[List[numpy.ndarray[numpy.float64[m, 2]]][3]], stereo_links: List[List[Tuple[int, int, int, float]]], project_matrices: List[numpy.ndarray[numpy.float64[3, 4]][3]], camera_origins: List[numpy.ndarray[numpy.float64[3, 1]][3]], c_max: float, search_range: float, search_range_traj: float, tau_1: int, tau_2: int, re_max: float) List[Tuple[numpy.ndarray[numpy.float64[m, 3]], float]] ¶
- fish_3d.cgreta.get_trajs_3d_t1t2t3(frames_v3: List[List[numpy.ndarray[numpy.float64[m, 2]]][3]], stereo_links: List[List[Tuple[int, int, int, float]]], project_matrices: List[numpy.ndarray[numpy.float64[3, 4]][3]], camera_origins: List[numpy.ndarray[numpy.float64[3, 1]][3]], c_max: float, search_range: float, search_range_traj: float, tau_1: int, tau_2: int, tau_3: int, re_max: float) List[Tuple[numpy.ndarray[numpy.float64[m, 3]], float]] ¶
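Example
A hedged sketch of the intended GReTA pipeline, assuming frames_v3 is organised as three per-view lists of (m, 2) centre arrays over frames, and cam_1, cam_2, cam_3 are calibrated fish_3d.camera.Camera instances; all threshold values are illustrative:

from fish_3d import cstereo, cgreta

# one stereo-matching result per frame
stereo_links = [
    cstereo.match_v3(
        f1, f2, f3,
        cam_1.p, cam_2.p, cam_3.p,
        cam_1.o, cam_2.o, cam_3.o,
        5.0,   # tol_2d
    )
    for f1, f2, f3 in zip(*frames_v3)
]

trajs = cgreta.get_trajs_3d(
    frames_v3, stereo_links,
    [cam_1.p, cam_2.p, cam_3.p],   # projection matrices
    [cam_1.o, cam_2.o, cam_3.o],   # camera origins
    1.0,    # c_max
    50.0,   # search_range
    5.0,    # re_max
)
# each element of trajs is (positions, error), positions shape (m, 3)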
fish_3d.cray_trace module¶
refractive ray tracing
- fish_3d.cray_trace.get_intersect_multiple(lines: numpy.ndarray[numpy.float64[m, 18], flags.c_contiguous]) numpy.ndarray[numpy.float64[m, 3]] ¶
calculate the points that are closest to a collection of multiple lines
- fish_3d.cray_trace.get_intersect_of_lines(lines: numpy.ndarray[numpy.float64]) numpy.ndarray[numpy.float64] ¶
calculate the point that is closest to multiple lines
- fish_3d.cray_trace.get_intersect_single(lines: numpy.ndarray[numpy.float64[3, 6], flags.c_contiguous]) numpy.ndarray[numpy.float64[1, 3]] ¶
calculate the point that is closest to multiple lines
fish_3d.cstereo module¶
stereo matching & greta tracking
- fish_3d.cstereo.get_error(centres: List[numpy.ndarray[numpy.float64[2, 1]][3]], Ps: List[numpy.ndarray[numpy.float64[3, 4]][3]], Os: List[numpy.ndarray[numpy.float64[3, 1]][3]]) float ¶
- fish_3d.cstereo.locate_v3(centres_1: numpy.ndarray[numpy.float64[m, 2]], centres_2: numpy.ndarray[numpy.float64[m, 2]], centres_3: numpy.ndarray[numpy.float64[m, 2]], P1: numpy.ndarray[numpy.float64[3, 4]], P2: numpy.ndarray[numpy.float64[3, 4]], P3: numpy.ndarray[numpy.float64[3, 4]], O1: numpy.ndarray[numpy.float64[3, 1]], O2: numpy.ndarray[numpy.float64[3, 1]], O3: numpy.ndarray[numpy.float64[3, 1]], tol_2d: float, optimise: bool = True) Tuple[numpy.ndarray[numpy.float64[m, 3]], numpy.ndarray[numpy.float64[m, 1]]] ¶
- fish_3d.cstereo.match_v3(centres_1: numpy.ndarray[numpy.float64[m, 2]], centres_2: numpy.ndarray[numpy.float64[m, 2]], centres_3: numpy.ndarray[numpy.float64[m, 2]], P1: numpy.ndarray[numpy.float64[3, 4]], P2: numpy.ndarray[numpy.float64[3, 4]], P3: numpy.ndarray[numpy.float64[3, 4]], O1: numpy.ndarray[numpy.float64[3, 1]], O2: numpy.ndarray[numpy.float64[3, 1]], O3: numpy.ndarray[numpy.float64[3, 1]], tol_2d: float, optimise: bool = True) List[Tuple[int, int, int, float]] ¶
- fish_3d.cstereo.match_v3_verbose(centres_1: numpy.ndarray[numpy.float64[m, 2]], centres_2: numpy.ndarray[numpy.float64[m, 2]], centres_3: numpy.ndarray[numpy.float64[m, 2]], P1: numpy.ndarray[numpy.float64[3, 4]], P2: numpy.ndarray[numpy.float64[3, 4]], P3: numpy.ndarray[numpy.float64[3, 4]], O1: numpy.ndarray[numpy.float64[3, 1]], O2: numpy.ndarray[numpy.float64[3, 1]], O3: numpy.ndarray[numpy.float64[3, 1]], tol_2d: float, optimise: bool = True) Tuple[List[Tuple[int, int, int, float]], numpy.ndarray[numpy.float64[m, 3]], numpy.ndarray[numpy.float64[m, 1]]] ¶
- fish_3d.cstereo.refractive_project(points: numpy.ndarray[numpy.float64[m, 3]], P: numpy.ndarray[numpy.float64[3, 4]], O: numpy.ndarray[numpy.float64[3, 1]]) numpy.ndarray[numpy.float64[m, 2]] ¶
- fish_3d.cstereo.refractive_triangulate(C1: numpy.ndarray[numpy.float64[m, 2]], C2: numpy.ndarray[numpy.float64[m, 2]], C3: numpy.ndarray[numpy.float64[m, 2]], P1: numpy.ndarray[numpy.float64[3, 4]], P2: numpy.ndarray[numpy.float64[3, 4]], P3: numpy.ndarray[numpy.float64[3, 4]], O1: numpy.ndarray[numpy.float64[3, 1]], O2: numpy.ndarray[numpy.float64[3, 1]], O3: numpy.ndarray[numpy.float64[3, 1]]) numpy.ndarray[numpy.float64[m, 3]] ¶
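Example
A single-frame sketch: match centres across the three views, then triangulate the matched triplets. The calibrated cameras (cam_1..cam_3) and the (m, 2) centre arrays are assumed given, and the tolerance is illustrative:

import numpy as np
from fish_3d import cstereo

links = cstereo.match_v3(
    centres_1, centres_2, centres_3,
    cam_1.p, cam_2.p, cam_3.p,
    cam_1.o, cam_2.o, cam_3.o,
    5.0,   # tol_2d, in pixels
)
# each link is (index in view 1, index in view 2, index in view 3, error)
c1 = np.array([centres_1[i] for i, _, _, _ in links])
c2 = np.array([centres_2[j] for _, j, _, _ in links])
c3 = np.array([centres_3[k] for _, _, k, _ in links])
points_3d = cstereo.refractive_triangulate(
    c1, c2, c3,
    cam_1.p, cam_2.p, cam_3.p,
    cam_1.o, cam_2.o, cam_3.o,
)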
fish_3d.ctemporal module¶
a 2D linking module
fish_3d.cutility module¶
helper functions optimised in cpp
fish_3d.ellipse module¶
Some helper functions I wrote to reconstruct ellipses from three views. This is used to calculate the orientation of my fish tank.
- fish_3d.ellipse.cost_conic(RT, K, C, p2d, p3dh)¶
Cost function to incorporate the conic measurement into camera calibration. It is meant to optimise the result of the cv2.solvePnP function. The conic in the image should correspond to a circle in 3D at the plane Z=0.
- Parameters
RT (np.ndarray) – the rotation and translation to be optimised. shape (6, )
K (np.ndarray) – the intrinsic camera matrix, shape (3, 3)
C (np.ndarray) – the matrix for the measured conic, shape (3, 3)
p2d (np.ndarray) – 2d features for the solvePnP method, shape (n, 2)
p3dh (np.ndarray) – the homogeneous representation of 3d positions for the solvePnP method, shape (n, 4)
- Returns
the geometric distance
- Return type
float
- fish_3d.ellipse.cost_conic_triple(RT123, K1, K2, K3, C1, C2, C3, p2d1, p2d2, p2d3, p3dh)¶
Cost function to incorporate the conic measurement into camera calibration. It is meant to optimise the result of the cv2.solvePnP function. The conic in the image should correspond to a circle in 3D at the plane Z=0.
- Parameters
RT123 (numpy.ndarray) – the rotation and translation to be optimised, shape (18,)
K1 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 1
K2 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 2
K3 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 3
C1 (numpy.ndarray) – the conic matrix from camera 1, shape (3, 3)
C2 (numpy.ndarray) – the conic matrix from camera 2, shape (3, 3)
C3 (numpy.ndarray) – the conic matrix from camera 3, shape (3, 3)
p2d1 (numpy.ndarray) – the 2d features for the solvePnP method from camera 1, shape (n, 2)
p2d2 (numpy.ndarray) – the 2d features for the solvePnP method from camera 2, shape (n, 2)
p2d3 (numpy.ndarray) – the 2d features for the solvePnP method from camera 3, shape (n, 2)
p3dh (numpy.ndarray) – the homogeneous representation of 3d positions for the solvePnP method, shape (n, 4)
- Returns
the geometric distance
- Return type
float
- fish_3d.ellipse.draw_ellipse(angle: numpy.ndarray, ellipse: List[float]) numpy.ndarray ¶
Return (u, v) coordinates for plotting. There is no need to use np.flip when plotting over the figure.
- fish_3d.ellipse.find_projection(ellipse, line)¶
find the point on an ellipse that is closest to a line
- Parameters
ellipse – represented as (a, b, xc, yc, rot), the geometrical form
line – represented as (l1, l2, l3) where l1 x + l2 y + l3 == 0
(I am sorry for the inconsistent notations)
- fish_3d.ellipse.get_conic_coef(xc, yc, a, b, rotation)¶
Represent an ellipse from the geometric form (a, b, x_centre, y_centre, rotation) to the algebraic form (α x^2 + β x y + γ y^2 + δ x + ε y + η = 0)
The parameters are consistent with the parameters of skimage.measure.EllipseModel
- fish_3d.ellipse.get_conic_matrix(conic_coef)¶
Assemble the conic coefficients into a conic matrix (AQ)
See the wiki page
- Parameters
conic_coef (iterable) – a container with 6 numbers
- Returns
the conic matrix
- Return type
np.ndarray
- fish_3d.ellipse.get_geometric_coef(matrix)¶
Calculate the geometric coefficients of the conic from a conic matrix:
$$ A_Q = \begin{pmatrix} A & B/2 & D/2 \\ B/2 & C & E/2 \\ D/2 & E/2 & F \end{pmatrix} $$
The conic form is written as,
$$ A x^2 + B xy + C y^2 + D x + E y + F = 0 $$
Reference: Wikipedia
- Parameters
matrix (np.ndarray) – the conic matrix, being AQ in the wiki.
- Returns
(xc, yc, a, b, rotation)
- Return type
tuple
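Example
A round-trip sketch between the geometric and matrix representations (the numeric values are arbitrary):

import numpy as np
from fish_3d import ellipse

coef = ellipse.get_conic_coef(xc=10.0, yc=5.0, a=4.0, b=2.0, rotation=0.3)
AQ = ellipse.get_conic_matrix(coef)           # symmetric (3, 3) matrix
xc, yc, a, b, rot = ellipse.get_geometric_coef(AQ)

# a point on the ellipse should satisfy [x, y, 1] @ AQ @ [x, y, 1] ~= 0
t = 0.7
x = 10.0 + 4.0 * np.cos(0.3) * np.cos(t) - 2.0 * np.sin(0.3) * np.sin(t)
y = 5.0 + 4.0 * np.sin(0.3) * np.cos(t) + 2.0 * np.cos(0.3) * np.sin(t)
p = np.array([x, y, 1.0])
print(p @ AQ @ p)   # should be numerically close to zero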
- fish_3d.ellipse.get_intersection(ellipse, line)¶
Find the intersection between an ellipse and a line. If there is no intersection, find the point on the ellipse that is closest to the line.
- Parameters
ellipse (numpy.ndarray) – represented as (a, b, xc, yc, rot), the geometrical form
line (numpy.ndarray) – represented as (l1, l2, l3) where l1 x + l2 y + l3 == 0
- fish_3d.ellipse.match_ellipse_sloopy(cameras: List[Camera], ellipses: List[List[float]], N: int, min_diff=250, max_cost=10)¶
1. Randomly choose N points in view 1
2. For every chosen point (P1):
   1. Calculate the POI of P1 in the other two views, getting POI2 & POI3
   2. For POI2, calculate its projection on C2, getting P2 (using the camera's information)
   3. For POI3, calculate its projection on C3, getting P3
   4. Reconstruct the 3D point using P1, P2, & P3
3. These points should be points on the surface
- fish_3d.ellipse.parse_ellipses_imagej(csv_file)¶
the ellipse expression corresponds to the (u, v) coordinate system
fish_3d.ray_trace module¶
- fish_3d.ray_trace.cost_snell(xy, z, location, origin, normal, refractive_index)¶
- fish_3d.ray_trace.epipolar_draw(uv, camera_1, camera_2, image_2, interface=0, depth=400, normal=(0, 0, 1), n=1.33)¶
return all the pixels that contain the epipolar line in image_2, re-projected & distorted
The meanings of variable names can be found in this paper
- Parameters
uv – (u, v) location of a pixel on image from camera_1
interface – height of water level in WORLD coordinate system
normal – normal of the air-water interface, from water to air
n – refractive index of water (or some other media)
- fish_3d.ray_trace.epipolar_la(uv, camera_1, camera_2, interface=0, depth=400, normal=(0, 0, 1), n=1.33)¶
Linear approximation of the epipolar line under water. The line passes through two UNDISTORTED projection points, one at the interface and one below the water. Five epipolar points under water are used for a linear fit.
- fish_3d.ray_trace.epipolar_la_draw(xy, camera_1, camera_2, image_2, interface=0, depth=400, normal=(0, 0, 1), n=1.33)¶
Linear approximation of the epipolar line under water, using 3 epipolar points under water and a linear fit.
- fish_3d.ray_trace.epipolar_refractive(uv, camera_1, camera_2, image_2, interface=0, normal=(0, 0, 1), step=1, n=1.33)¶
The meanings of variable names can be found in this paper: 10.1109/CRV.2011.26
- Parameters
uv – (u, v) location of a pixel on image from camera_1, NOT (x, y)
interface – height of water level in WORLD coordinate system
normal – normal of the air-water interface, from water to air
n – refractive index of water (or some other media)
- Here the goal is:
For a given pixel (u, v) in the image taken by camera #1, calculate its projection on the air-water interface (poi_1), and the direction of the refracted ray (trans_vec)
- While the projection is still on the image from camera_2:
Calculate the position (M) from poi_1 going along trans_vec
Project M onto camera #2
Collect the projection points
- fish_3d.ray_trace.find_u(u, n, d, x, z)¶
The meanings of variable names can be found in this paper: 10.1109/CRV.2011.26
- fish_3d.ray_trace.get_intersect_of_lines(lines)¶
- Parameters
lines (np.ndarray) – a collection of different lines, shape (n, 2, 3); each line = [point (a), unit direction (v)]
- fish_3d.ray_trace.get_intersect_of_lines_slow(lines)¶
Calculate intersecting point of many lines
- Parameters
lines (np.ndarray) – a collection of lines, shape (n, 2, 3)
- Returns
the point in 3D whose summed distance to all lines is minimal
- Return type
np.ndarray
(I followed this answer: https://stackoverflow.com/a/48201730/4116538)
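Example
The closed form behind this least-squares "intersection" (following the linked answer): for lines with base points a_i and unit directions v_i, the point p minimising sum_i |(I - v_i v_i^T)(p - a_i)|^2 solves a 3x3 linear system. A pure-numpy sketch of that maths, not the package's compiled implementation:

import numpy as np

def intersect_lines(lines):
    """lines: shape (n, 2, 3); lines[i] = (base point a, unit direction v)."""
    a, v = lines[:, 0], lines[:, 1]
    projs = np.eye(3) - v[:, :, None] * v[:, None, :]  # I - v v^T, (n, 3, 3)
    lhs = projs.sum(axis=0)                            # (3, 3)
    rhs = (projs @ a[:, :, None]).sum(axis=0)          # (3, 1)
    return np.linalg.solve(lhs, rhs).ravel()

# two lines that cross at (1, 1, 0)
lines = np.array([
    [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],
    [[0.0, 1.0, 0.0], [1.0, 0.0, 0.0]],
])
print(intersect_lines(lines))   # ~ [1, 1, 0]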
- fish_3d.ray_trace.get_poi(camera: Camera, z: float, coordinate: numpy.ndarray)¶
Calculate the poi, point on interface. See the sketch for its meaning
camera
    /
   /
  /
---------[poi]-------- air-water interface (z = z)
      |
      |
      |
     fish
For the calculation, the equation P @ [x, y, z, 1]’ = [c * v, c * u, c]’ is solved for x, y, and c, with everything else known
- Parameters
- Returns
the points on interface, [X, Y, Z], shape (3, n)
- Return type
np.ndarray
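Example
A sketch of that linear system for a single pixel. Note the docstring's (v, u) ordering; the version below uses the plain (u, v) convention, so adapt as needed:

import numpy as np

def poi_single(P, z, u, v):
    """Solve P @ [x, y, z, 1]' = c * [u, v, 1]' for (x, y, c) with known z."""
    A = np.array([
        [P[0, 0], P[0, 1], -u],
        [P[1, 0], P[1, 1], -v],
        [P[2, 0], P[2, 1], -1.0],
    ])
    b = -(P[:, 2] * z + P[:, 3])   # move the known z terms to the right
    x, y, _ = np.linalg.solve(A, b)
    return np.array([x, y, z])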
- fish_3d.ray_trace.get_poi_cluster(camera, z, coordinate)¶
- fish_3d.ray_trace.get_reproj_err(point_3d, points_2d, cameras, water_level, normal)¶
- fish_3d.ray_trace.get_trans_vec(incident_vec, refractive_index=1.33, normal=(0, 0, 1))¶
get the unit vector of transmitted ray from air to water
- Parameters
incident_vec (np.ndarray) – the vector representing the incident ray
refractive_index (float) – the refractive index of water is 1.33 @ 25°C
normal (np.ndarray) – the normal vector facing up; the coordinate system looks like
^ +z
|     air
|
------------------>
|     water
|
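Example
A sketch of the vector form of Snell's law presumably used here (air to water, normal pointing up, incident ray pointing down into the water); this is the standard textbook formula, not necessarily the exact internal implementation:

import numpy as np

def snell_refract(incident, refractive_index=1.33, normal=(0.0, 0.0, 1.0)):
    i = np.asarray(incident, float)
    i = i / np.linalg.norm(i)                 # unit incident direction
    n = np.asarray(normal, float)
    eta = 1.0 / refractive_index              # n_air / n_water
    c1 = -n @ i                               # cos(incidence angle), >= 0
    c2 = np.sqrt(1.0 - eta ** 2 * (1.0 - c1 ** 2))
    return eta * i + (eta * c1 - c2) * n      # unit transmitted direction

print(snell_refract([0.3, 0.0, -1.0]))        # bends towards the normal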
- fish_3d.ray_trace.get_trans_vecs(incident_vecs, refractive_index=1.33, normal=(0, 0, 1))¶
Get the many unit vectors of transmitted rays from air to water
- Parameters
incident_vecs (np.ndarray) – vectors representing many incident rays, shape (n, 3)
refractive_index (float) – the refractive index of water is 1.33 @ 25°C
normal (np.ndarray) – the normal vector facing up; the coordinate system looks like
^ +z
|     air
|
------------------>
|     water
|
- fish_3d.ray_trace.get_u(n, d, x, z)¶
n - relative refractive index
d, x, z - lengths in mm
- fish_3d.ray_trace.is_inside_image(uv, image)¶
- fish_3d.ray_trace.pl_dist(point, line)¶
Calculate distance between a point and a line in 3D
- Parameters
point (np.ndarray) – shape (3, )
line (dict) – {‘unit’: direction, ‘base’: start_point}
- Returns
the distance between a line and a point
- Return type
np.ndarray
- fish_3d.ray_trace.pl_dist_batch(points, lines)¶
Calculate distance between many points and many lines in 3D
- Parameters
points (np.ndarray) – shape (n, 3)
lines (np.ndarray) – shape (n, view, 2, dim) [dim = 3]
- Returns
many distances, shape (n, )
- Return type
np.array
- fish_3d.ray_trace.pl_dist_faster(point, lines)¶
Calculate the distance between a point and many lines in 3D
- Parameters
point (np.ndarray) – shape (3,)
line (dict) – {‘unit’: direction, ‘base’: start_point}
- Returns
the distance between a line and a point
- Return type
np.ndarray
- fish_3d.ray_trace.py_get_intersect_of_lines_batch(lines)¶
- Parameters
lines (np.ndarray) – a collection of many matched lines, shape -> (n, view, 2, 3)
- Returns
n points in 3D, each the intersection of view lines
- Return type
np.ndarray
- fish_3d.ray_trace.ray_trace_refractive(centres, cameras, z=0, normal=(0, 0, 1), refractive_index=1.33)¶
- Parameters
centres – a list of centres (u, v) in different views; they should be undistorted
cameras – a list of calibrated Camera objects
z – the z-value of the refractive interface
- fish_3d.ray_trace.ray_trace_refractive_cluster(clusters, cameras, z=0, normal=(0, 0, 1), refractive_index=1.33)¶
- fish_3d.ray_trace.ray_trace_refractive_faster(centres, cameras, z=0, normal=(0, 0, 1), refractive_index=1.33)¶
- Parameters
centres – a list of centres (u, v) in different views; they should be undistorted
cameras – a list of calibrated Camera objects
z – the z-value of the refractive interface
- fish_3d.ray_trace.ray_trace_refractive_trajectory(trajectories: List[numpy.ndarray], cameras: List[Camera], z=0, normal=(0, 0, 1), refractive_index=1.33)¶
reconstruct a trajectory from many views
- Parameters
trajectories – list of positions belonging to the same individual at different time points, shape (time_points, dim)
- fish_3d.ray_trace.reproject_refractive(xyz, camera, water_level=0, normal=(0, 0, 1), refractive_index=1.333)¶
variable names following https://ieeexplore.ieee.org/document/5957554, figure 1
- fish_3d.ray_trace.reproject_refractive_no_distort(xyz, camera, water_level=0, normal=(0, 0, 1), refractive_index=1.333)¶
- fish_3d.ray_trace.same_direction(v1, v2, v3, axis)¶
fish_3d.stereolink module¶
- fish_3d.stereolink.extra_three_view_cluster_match(matched_indices, clusters_multi_view, cameras, tol_2d: float, sample_size: int, depth: float, report=False, normal=(0, 0, 1), water_level=0.0)¶
Match clusters taken simultaneously by three cameras, pretending that only the features that DID NOT correspond to indices in matched_indices appear in the views. matched_indices: shape (number, view)
- fish_3d.stereolink.get_fundamental_from_projections(p1, p2)¶
p1: projection matrix of camera for image 1, shape (3, 4)
p2: projection matrix of camera for image 2, shape (3, 4)
F: fundamental matrix, shape (3, 3), image 1 —> image 2
point_1’ * F * point_2 = 0
- fish_3d.stereolink.get_partial_cluster(cluster, size)¶
- fish_3d.stereolink.greedy_match(clusters, cameras, depth, normal, water_level, tol_2d, sample_size=10, report=True, order=(0, 1, 2), history=array([], shape=(0, 3), dtype=float64))¶
Use greedy algorithm to match clusters across THREE views
- Starting from view_1:
- starting from cluster_1:
find all possible correspondences according to the epipolar relationship in view 2 and view 3
for all possibilities in view 2, validate according to ray-tracing
for all possibilities in view 3, validate according to ray-tracing
- Parameters
clusters – A collection of points in the 2D image with the format of (u, v), NOT (x, y)
cameras – A collection of Camera instances
depth – The maximum depth of water used to constrain the length of the epipolar relation
normal – The direction of the normal of the water. It should be [0, 0, 1]
tol_2d – The tolerance on the distance between epipolar line and centers, unit is pixel
points – the number of points used in 3D stereo matching for each cluster, chosen randomly
- Returns
the matched indices across different views;
the corresponding 3D centres;
the corresponding reprojection errors
- Return type
tuple
- fish_3d.stereolink.join_pairs(pairs)¶
- fish_3d.stereolink.line2func(line)¶
- fish_3d.stereolink.match_clusters(clusters, cameras, normal, water_level)¶
Return allowed 3D points given matched clusters in different views. The error is the average perpendicular distance from the 3D points to the three rays, unit: mm.
- fish_3d.stereolink.match_points_v3(cameras: List[Camera], points: List[numpy.ndarray], max_cost=1)¶
match points in three views; no refraction is considered
- Parameters
cameras – a list of three cameras
points – (x, y) coordinates in three views
- fish_3d.stereolink.multi_view_link(c1, c2, f)¶
c1: homogeneous centres in image 1, shape (N, 3)
c2: homogeneous centres in image 2, shape (N, 3)
Supposing the centres in image 1 are in the right order (the order of c1), c2[order] gives the right order of the objects in image 2.
- fish_3d.stereolink.remove_conflict(matched_indices, matched_centres, reproj_errors)¶
Only allow each unique feature, indicated by a number in matched_indices, to appear once in each view. That is to say, fish #1 and fish #2 are NOT allowed to correspond to the SAME blob in the same picture. This means all values in matched_indices.T should be unique along the three views. It works for n views. matched_indices: shape (number, view)
- fish_3d.stereolink.remove_overlap(centres, errors, search_range=10)¶
centres: possible fish locations, array shape (number, dimension)
errors: reprojection errors of the COM of different clouds, shape (number, )
overlap = convex hull overlap; vertices of ch1 go into ch2
- fish_3d.stereolink.three_view_cluster_match(clusters_multi_view, cameras, tol_2d: float, sample_size: int, depth: float, report=False, normal=(0, 0, 1), water_level=0.0)¶
Match clusters taken simultaneously by three cameras. The clusters are assumed to be in water, n = 1.333.
- fish_3d.stereolink.three_view_match(features, cameras, tol_2d, normal=(0, 0, 1), water_level=0.0, depth=400)¶
- fish_3d.stereolink.triangulation_v3(positions: numpy.ndarray, cameras: List[Camera])¶
the positions in different views should be undistorted
fish_3d.utility module¶
- fish_3d.utility.box_count_polar_image(image, indices, invert=False, rawdata=False)¶
Calculate the average density inside different regions inside an image
- Parameters
image (numpy.ndarray) – the image taken by the camera without undistortion
indices (numpy.ndarray) – labelled image specifying different box regions
- fish_3d.utility.box_count_polar_video(video, labels, cores=2, report=True, invert=False, rawdata=False)¶
- fish_3d.utility.convert_traj_format(traj, t0)¶
Convert from (positions, error) to (time, positions). The starting & ending NaNs will be removed; the NaNs in the middle will be replaced by linear interpolation.
- Parameters
traj (tuple) – one trajectory, represented by (positions, error)
t0 (int) – the starting frame of this trajectory
- Returns
a list of ONE trajectory, represented by (time, positions)
- Return type
list of tuple
- fish_3d.utility.draw_fish(positions, ax, size=1)¶
Draw fish shaped scatter on a matplotlib Axes object
- Parameters
positions (np.ndarray) – positions of the 3D points, shape (n, 3)
ax (Axes) – an Axes instance from matplotlib
size (float) – the size of the fish
- Returns
None
- fish_3d.utility.fill_hole_1d(binary, size)¶
Fill “holes” in a binary signal whose length is smaller than size
- Parameters
binary (numpy.ndarray) – a boolean numpy array
size (int) – the holes whose length is smaller than size will be filled
- Returns
the filled binary array
- Return type
numpy.ndarray
Example
>>> binary = np.array([0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1])
>>> filled = np.array([0, 1, 1, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1])
>>> np.array_equal(filled, fill_hole_1d(binary, 2))
True
- fish_3d.utility.get_ABCD(corners, width, excess_rows) dict ¶
What is ABCD?
C +-------+ D
  |       |
  |       |
  |       |
A +-------+ B
(AB // CD, AC // BD)
- Parameters
corners (numpy.ndarray) – the coordinates of chessboard corners found by cv2.findChessboardCorners, shape (n, 2)
width (int) – the number of corners in a row
excess_rows (int) – if the chessboard is made of (m, n) corners (m > n), then n is the width and excess_rows = m - n
- Returns
the locations of A, B, C, D respectively.
- Return type
dict
- fish_3d.utility.get_affinity(abcd)¶
Getting the affinity matrix from a set of corners measured from a chessboard image
what is ABCD?
C +-------+ D
  |       |
  |       |
  |       |
A +-------+ B
- Parameters
abcd (dict) – the measured coordinates of chessboard corners
- Returns
the affine transformation
- Return type
numpy.ndarray
- fish_3d.utility.get_average_euclidean_transform(Rotations, Translations, index)¶
Calculate the averaged relative rotation and translation from different noisy measurements. There is an unknown Euclidean ambiguity between the different measurements.
- Parameters
Rotations (np.ndarray) – shape (n_measure, n_view, 3, 3)
Translations (np.ndarray) – shape (n_measure, n_view, 3)
index (int) – the transforms are between the different views and the view specified by the index
- Returns
the relative rotations and translations. The shapes of the elements are ((n_view, 3, 3), (n_view, 3))
- Return type
tuple
- fish_3d.utility.get_brcs(number=1, bias=(1.0, 0.7, 0.8), brightness=(0.25, 1.0))¶
Get biased random colours
- Parameters
- Returns
the colors, shape (n, 3) or (3) if number==1
- Return type
np.ndarray
- fish_3d.utility.get_cameras_with_averaged_euclidean_transform(cameras, index)¶
Update the extrinsic parameters of the cameras so that the relative rotation and translation between different views are replaced by the average of different measurements.
- Parameters
- Returns
a list of cameras whose relative Euclidean transformations were replaced by the average over different measurements. The “shape” of the final result is (n_measure, n_view).
- Return type
list
- fish_3d.utility.get_clusters(image, threshold, min_size, roi)¶
Apply a threshold to the image and label the disconnected parts. Small labels (size < min_size) are erased. The coordinates of the labels are returned, shape (label_number, 3).
- fish_3d.utility.get_corners(image: numpy.array, rows: int, cols: int, camera_model=None)¶
use findChessboardCorners in opencv to get coordinates of corners
- fish_3d.utility.get_homography(camera, angle_num=10)¶
Get the homography that recovers, up to a similarity, the 2D image perpendicular to the z-axis
- Parameters
camera (Camera) – a Camera instance of current camera
angle_num (int) – a virtual chessboard is rotated angle_num times for calculation
- fish_3d.utility.get_homography_image(image, rows, cols, camera_model=None)¶
get the homography transformation from an image with a chessboard
- Parameters
image (numpy.ndarray) – a 2d image
rows (int) – the number of internal corners inside each row
cols (int) – the number of internal corners inside each column
camera_model (Camera) – (optional) a Camera instance that stores the distortion coefficients of the lens
- fish_3d.utility.get_indices(labels)¶
- fish_3d.utility.get_optimised_camera_c2c(camera, conic_mat, p2d, p3d, method='Nelder-Mead')¶
Optimise the extrinsic parameter of the camera with a known 3D circle.
- Parameters
camera (Camera) – a calibrated camera, its extrinsic parameters will be used as the initial guess for the optimisation.
conic_mat (numpy.ndarray) – the matrix representation of an ellipse. the parameter should be obtained from an undistorted image.
p2d (numpy.ndarray) – the undistorted 2d locations for the PnP problem, shape (n, 2)
p3d (numpy.ndarray) – the 3d locations for the PnP problem, shape (n, 3)
method (str) – the name of the optimisation method. See scipy doc.
- Returns
a new camera with better extrinsic parameters.
- Return type
Camera
- fish_3d.utility.get_optimised_camera_triplet_c2c(cameras, conic_matrices, p2ds, p3d, method='Nelder-Mead')¶
Optimise 3 cameras with 2D-3D correspondences as well as measured conics corresponding to a 3D circle on the plane \(Z=0\).
- Parameters
cameras (list) – three Camera instances
conic_matrices (list) – three conic matrices with shape (3, 3)
p2ds (list) – three distorted 2d locations whose shape is (n, 2)
p3d (numpy.ndarray) – the 3d locations, shape (n, 3)
method (str) – the method for the non-linear optimisation
- Returns
three cameras whose extrinsic parameters were optimised with the circle-to-conic correspondences
- Return type
list
- fish_3d.utility.get_orient_line(locations, orientations, length=10)¶
Get the lines for plotting the orientations
- Parameters
locations (numpy.ndarray) – shape (n, 2)
orientations (numpy.ndarray) – shape (n, )
- fish_3d.utility.get_overlap_pairs(trajs, num, rtol)¶
- Parameters
trajs (list) – a collections of trajectories, each trajectory is a tuple, containing (positions [N, 3], reprojection_error)
num (int) – the maximum number of allowed overlapped objects
rtol (float) – the minimum distance between two non-overlapped objects
- Returns
the indices of overlapped objects
- Return type
list of tuple
- fish_3d.utility.get_polar_chop_spatial(radius, n_angle, n_radius)¶
Retrieve the correspondence between the values from polar_chop and the radius/angle values.
- Parameters
radius (int) – maximum radius in the polar coordinate system
n_angle (int) – number of bins in terms of angle
n_radius (int) – number of bins in terms of radius
- Returns
{ label_value : (angle, radius), … }
- Return type
dict
- fish_3d.utility.get_short_trajs(cameras, features_mv_mt, st_error_tol, search_range, t1, t2, z_min, z_max, overlap_num, overlap_rtol, reproj_err_tol, t3=1)¶
Getting short 3D trajectories from 2D positions and camera information
- Parameters
cameras (Camera) – cameras for 3 views
features_mv_mt (list) – 2d features in different views at different frames
st_error_tol (float) – the stereo reprojection error cut-off for stereo linking
tau (int) – the length of trajectories in each batch; unit: frame the overlap between different batches will be tau // 2
z_min (float) – the minimum allowed z-values for all trajectories
z_max (float) – the maximum allowed z-values for all trajectories
t1 (int) – the time duration in the first iteration in GReTA
t2 (int) – the time duration in the second iteration in GReTA
t3 (int) – the time duration in the third iteration in GReTA
overlap_num (int) – if two trajectories have more overlapped positions than overlap_num, establish a link.
overlap_rtol (float) – positions whose distance is smaller than overlap_rtol are considered to be overlapped.
reproj_err_tol (float) – 3d positions whose reprojection error is greater than this will not be re-constructed; instead a NaN is inserted into the trajectory
- Returns
trajectories
- Return type
list [ (numpy.ndarray, float) ]
- fish_3d.utility.get_similarity(abcd, H_aff)¶
Getting the similarity matrix from a set of corners measured from a chessboard image
What is ABCD?
C +-------+ D
  |       |
  |       |
  |       |
A +-------+ B
- Parameters
abcd (dict) – the measured coordinates of chessboard corners
H_aff (numpy.ndarray) – affinity that makes coordinates affinely recitified
- Returns
the similar transformation matrix
- Return type
numpy.ndarray
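Example
A hedged sketch of the stratified rectification these helpers support. The image file, corner counts and the composition order of the two homographies are assumptions:

import cv2
import numpy as np
from fish_3d import utility

image = cv2.imread('view_top.png', cv2.IMREAD_GRAYSCALE)
corners = utility.get_corners(image, rows=6, cols=8)
abcd = utility.get_ABCD(corners, width=6, excess_rows=2)
H_aff = utility.get_affinity(abcd)            # affine rectification
H_sim = utility.get_similarity(abcd, H_aff)   # upgrade to a similarity
H = H_sim @ H_aff                             # assumed composition
rectified = cv2.warpPerspective(image, H, (image.shape[1], image.shape[0]))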
- fish_3d.utility.get_temporal_overlapped_pairs(batch_1, batch_2, lag, ntol, rtol, unique='conn')¶
Get pairs that link overlapped trajectories from two batches. The temporal size of trajectories in both batches should be the same.
- Parameters
batch_1 – (list [ ( numpy.ndarray, float ) ]): each trajectory is (positions, error), time is [t0, t0 + size]
batch_2 – (list [ ( numpy.ndarray, float ) ]): each trajectory is (positions, error), time is [t0 + lag, t0 + size + lag]
lag (int) – The temporal lag between two batches
ntol (int) – if two trajectories have more overlapped positions than ntol, establish a link
rtol (float) – positions whose distance is smaller than rtol are considered to be overlapped.
- Returns
indices of a pair of overlapped trajectories, shape (n, 2)
- Return type
numpy.ndarray
- fish_3d.utility.get_trajectory_batches(cameras, features_mv_mt, st_error_tol, search_range, tau, z_min, z_max, overlap_num, overlap_rtol, reproj_err_tol, t1=1)¶
Getting batches of short 3D trajectories from 2D positions and camera information. A batch contains trajectories that start from t0 and end at t0 + tau. It is designed so that the same object will overlap in different batches, and they can be joined with the function resolve_temporal_overlap.
───────────▶ (trajectory in batch 1)
      ───────────▶ (trajectory in batch 2)
            ───────────▶ (trajectory in batch 3)
- Parameters
cameras (Camera) – cameras for 3 views
features_mv_mt (list) – 2d features in different views at different frames
st_error_tol (float) – the stereo reprojection error cut-off for stereo linking
tau (int) – the length of trajectories in each batch, unit: frame; the overlap between different batches will be tau // 2
z_min (float) – the minimum allowed z-positions for all trajectories
z_max (float) – the maximum allowed z-positions for all trajectories
overlap_num (int) – if two trajectories have more overlapped positions than overlap_num, establish a link.
overlap_rtol (float) – positions whose distance is smaller than overlap_rtol are considered to be overlapped.
reproj_err_tol (float) – 3d positions whose reprojection error is greater than this will not be re-constructed; instead a NaN is inserted into the trajectory
- Returns
trajectories in different batches
- Return type
list [ list [ (numpy.ndarray, float) ] ]
- fish_3d.utility.get_updated_camera(camera, R, T)¶
Get the updated camera with new rotation and translation.
- Parameters
R (numpy.ndarray) – the new rotation vector \(\in so(3)\)
T (numpy.ndarray) – the new translation vector \(\in \mathbb{R}^3\)
- Returns
a new Camera instance whose extrinsic parameters were updated.
- Return type
Camera
- fish_3d.utility.get_valid_ctraj(trajectories, z_min, z_max)¶
Extract valid trajectories (inside box boundary) from raw trajectories
- Parameters
trajectories (list [ ( numpy.ndarray, float ) ]) – a collection of trajectories, each trajectory is (positions , error)
z_min (float) – the minimum allowed z-coordinate of each trajectory corresponding to the bottom of the boundary.
z_max (float) – the maximum allowed z-coordinate of each trajectory corresponding to the top of the boundary.
- Returns
valid trajectories
- Return type
list [ ( numpy.ndarray, float ) ]
- fish_3d.utility.interpolate_nan(coordinates)¶
Replace NaNs of a (n, 3) array with linear interpolation along the first axis
- Parameters
coordinates (numpy.ndarray) – xyz coordinates of a trajectory, might contain nan
is_nan (numpy.ndarray) – 1d boolean array showing if coordinates[i] is nan or not
- Returns
the interpolated coordinates array
- Return type
numpy.ndarray
Example
>>> target = np.array([np.arange(100)] * 3).T.astype(float)
>>> target.shape
(100, 3)
>>> with_nan = target.copy()
>>> nan_idx = np.random.randint(0, 100, 50)
>>> with_nan[nan_idx] = np.nan
>>> with_nan[0] = 0
>>> with_nan[-1] = 99
>>> np.allclose(target, interpolate_nan(with_nan))
True
- fish_3d.utility.optimise_c2c(R, T, K, C, p2d, p3dh, method='Nelder-Mead')¶
Optimise the camera extrinsic parameters by measuring the ellipse (conic) projected by a circle at plane \(\pi_z = (0, 0, 1, 0)^T\).
- Parameters
R (numpy.ndarray) – the rotation vector \(\in so(3)\)
T (numpy.ndarray) – the translation vector \(\in \mathbb{R}^3\)
K (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\)
C (numpy.ndarray) – the matrix representation of a conic, shape (3, 3). The conic is a projection of the circle.
p2d (numpy.ndarray) – the undistorted 2d points for the PnP problem, shape (n, 2)
p3dh (numpy.ndarray) – the homogeneous representations of 3d points for the PnP problem, shape (n, 4)
- Returns
the optimised \(\mathbf{R} \in so(3)\) and \(\mathbf{T} \in \mathbb{R}^3\)
- Return type
tuple
- fish_3d.utility.optimise_triplet_c2c(R1, T1, R2, T2, R3, T3, K1, K2, K3, C1, C2, C3, p2d1, p2d2, p2d3, p3dh)¶
Optimise three camera extrinsic parameters by measuring the ellipse (conic) projected by a circle at plane \(\pi_z = (0, 0, 1, 0)^T\).
- Parameters
R1 (numpy.ndarray) – the rotation vector \(\in so(3)\) of camera 1
T1 (numpy.ndarray) – the translation vector \(\in \mathbb{R}^3\) of camera 1
R2 (numpy.ndarray) – the rotation vector \(\in so(3)\) of camera 2
T2 (numpy.ndarray) – the translation vector \(\in \mathbb{R}^3\) of camera 2
R3 (numpy.ndarray) – the rotation vector \(\in so(3)\) of camera 3
T3 (numpy.ndarray) – the translation vector \(\in \mathbb{R}^3\) of camera 3
K1 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 1
K2 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 2
K3 (numpy.ndarray) – the calibration matrix \(\in \mathbb{R}^{3 \times 3}\) of camera 3
C1 (numpy.ndarray) – the conic matrix from camera 1, shape (3, 3)
C2 (numpy.ndarray) – the conic matrix from camera 2, shape (3, 3)
C3 (numpy.ndarray) – the conic matrix from camera 3, shape (3, 3)
p2d1 (numpy.ndarray) – the undistorted 2d features for the PnP problem from camera 1, shape (n, 2)
p2d2 (numpy.ndarray) – the undistorted 2d features for the PnP problem from camera 2, shape (n, 2)
p2d3 (numpy.ndarray) – the undistorted 2d features for the PnP problem from camera 3, shape (n, 2)
p3dh (numpy.ndarray) – the homogeneous representations of 3d points for the PnP problem, shape (n, 4)
- Returns
the optimised \(\mathbf{R} \in so(3)\) and \(\mathbf{T} \in \mathbb{R}^3\)
- Return type
tuple
- fish_3d.utility.plot_cameras(ax, cameras, water_level=0, depth=400)¶
Draw cameras on the matplotlib Axes instance with water.
This is typically designed to represent the experiment in my PhD.
- fish_3d.utility.plot_epl(line: List[float], image: numpy.ndarray)¶
Get the scatter points of an epipolar line inside an image. The image is considered as being stored in (row [y], column [x]).
- fish_3d.utility.plot_reproject(image, features, pos_3d, camera, filename=None, water_level=0, normal=(0, 0, 1))¶
- Parameters
pos_3d (np.ndarray) – 3d positions, shape (n, 3)
- fish_3d.utility.plot_reproject_with_roi(image, roi, features, pos_3d, camera, filename=None, water_level=0, normal=(0, 0, 1))¶
- fish_3d.utility.polar_chop(image, H_sim, centre, radius, n_angle, n_radius, dist_coef, k)¶
Chop an image in polar coordinates and return the chopped result as a labelled image
- Parameters
image (numpy.ndarray) – 2d image as a numpy array
H_sim (numpy.ndarray) – a homography (3 x 3 matrix) to similarly rectify the image
centre (numpy.ndarray) – origin of the polar coordinate system
radius (int) – maximum radius in the polar coordinate system
n_angle (int) – number of bins in terms of angle
n_radius (int) – number of bins in terms of radius
dist_coef (numpy.ndarray) – distortion coefficients of the camera, shape (5, ): k1, k2, p1, p2, k3 (from opencv by default)
k (numpy.ndarray) – camera calibration matrix (bible, P155)
- Returns
labelled image where each chopped region is labelled with a different value
- Return type
numpy.array
- fish_3d.utility.post_process_ctraj(trajs_3d, t0, z_min, z_max, num=5, rtol=10)¶
Refine the trajectories obtained from cgreta, following three steps:
Removing trajectories that are outside the boundary (fish tank).
Removing trajectories that overlap. Overlapping means that for 2 trajectories, there are more than num positions whose distance is below rtol.
Converting the format of the trajectory, from (position, error) to (time, position).
- Parameters
trajs_3d (list [ ( numpy.ndarray, float ) ]) – a collection of trajectories, each trajectory is (positions , error)
t0 (int) – the starting frame of these trajectories.
z_min (float) – the minimum allowed z-coordinate of each trajectory corresponding to the bottom of the boundary.
z_max (float) – the maximum allowed z-coordinate of each trajectory corresponding to the top of the boundary.
num (int) – the maximum number of allowed overlapped positions.
- Returns
a collection of refined trajectories, represented as (time, position)
- Return type
list [ (numpy.ndarray, numpy.ndarray) ]
- fish_3d.utility.refine_trajectory(trajectory, cameras, features, tol_2d)¶
Refine the trajectory so that each reprojected position matches the features detected in different cameras. The purpose of the function is to optimise the linearly interpolated trajectories.
- Parameters
trajectory (numpy.ndarray) – a fully interpolated trajectory, shape (n_frame, 3).
cameras (list) – a collections of cameras.
features (list) – the positions of 2D features from different cameras, shape of element: (n_frame, 2).
tol_2d (float) – the tolerance for 2D reprojection errors. The very problematic 3D points will be replaced by the original points in the trajectory.
- Returns
an optimised trajectory, where the 3D locations are re-calculated from the 2D features
- Return type
numpy.ndarray
- fish_3d.utility.remove_spatial_overlap(trajectories, ntol, rtol)¶
If two trajectories overlap in space, choose the one with the minimum reprojection error.
- Parameters
trajectories (list of (numpy.ndarray, float)) – a collection of trajectories, each trajectory is (positions, reprojection_error)
ntol (int) – if two trajectories have more overlapped positions than ntol, establish a link.
rtol (float) – positions whose distance is smaller than rtol are considered to be overlapped.
- Returns
trajectories without spatial overlap
- Return type
list of (numpy.ndarray, float)
- fish_3d.utility.resolve_temporal_overlap(trajectory_batches, lag, ntol, rtol)¶
For trajectories in many batches, extend them if they overlap.
For instance
INPUT:
|  lag  |
───────────▶ (trajectory in batch 1)
      ───────────▶ (trajectory in batch 2)
            ───────────▶ (trajectory in batch 3)

OUTPUT:
───────────────────────▶ (trajectory in result)
- Parameters
trajectory_batches – (list [ list [ (numpy.ndarray, float) ] ]): trajectories in different batches
lag (int) – the overlapped time between trajectories in two successive batches. TODO: fix the bug when lag is odd number
ntol (int) – if two trajectories have more overlapped positions than ntol, merge the two.
rtol (float) – positions whose distance is smaller than rtol are considered to be overlapped.
- Returns
a collection of resolved trajectories
- Return type
list [ (numpy.ndarray, float) ]
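Example
A hedged end-to-end sketch tying the batch construction and the overlap resolution together; cameras, features_mv_mt and every threshold below are placeholders:

from fish_3d import utility

tau = 20
batches = utility.get_trajectory_batches(
    cameras, features_mv_mt,
    st_error_tol=5.0, search_range=50.0, tau=tau,
    z_min=-400.0, z_max=0.0,
    overlap_num=5, overlap_rtol=10.0, reproj_err_tol=5.0,
)
trajs = utility.resolve_temporal_overlap(
    batches, lag=tau // 2, ntol=5, rtol=10.0,
)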
- fish_3d.utility.see_corners(image_file, corner_number=(23, 15))¶
- fish_3d.utility.solve_overlap_lp(points, errors, diameter)¶
Remove overlapped particles using linear programming, subject to the following requirements:
minimize the total error
particles do not overlap
retain most particles
- Parameters
points (np.ndarray) – particle locations, shape (N, dimension)
errors (np.ndarray) – the error (cost) of each particle, shape (N, )
diameter (float) – the minimum distance between non-overlapping particles
- Returns
the optimised positions, shape (N’, dimension)
- Return type
np.ndarray
- fish_3d.utility.update_orientation(orientations, locations, H, length=10)¶
Calculate the orientation after applying the homography H. This function is used to get a ‘rectified’ orientation.
- Parameters
orientation (numpy.ndarray) – angles of the fish, sin(angle) -> x, very sadly
locations (numpy.ndarray) – xy positions of fish in the image, not row-col
H (numpy.ndarray) – the homography matrix
length (int) – length of the orientation bar
- Returns
the rectified orientations
- Return type
numpy.ndarray