Interfaces

- edu.wpi.first.wpilibj.interfaces.Accelerometer: This interface is being removed with no replacement.
- edu.wpi.first.wpilibj.interfaces.Gyro: This interface is being removed with no replacement.
Classes

- edu.wpi.first.util.InterpolatingTreeMap: Use InterpolatingDoubleTreeMap instead.
- edu.wpi.first.wpilibj2.command.CommandBase
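For context on the InterpolatingTreeMap deprecation above: the replacement, InterpolatingDoubleTreeMap, stores double key-value pairs and linearly interpolates between the two nearest stored keys. Here is a minimal sketch of that behavior using only java.util.TreeMap; the class name InterpolatingLookup and the clamping at the table edges are illustrative assumptions, not WPILib code.

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch of linear interpolation over a sorted table,
// approximating what WPILib's InterpolatingDoubleTreeMap provides.
public class InterpolatingLookup {
    private final TreeMap<Double, Double> table = new TreeMap<>();

    public void put(double key, double value) {
        table.put(key, value);
    }

    public double get(double key) {
        Map.Entry<Double, Double> lo = table.floorEntry(key);
        Map.Entry<Double, Double> hi = table.ceilingEntry(key);
        if (lo == null) return hi.getValue();       // below smallest key: clamp
        if (hi == null) return lo.getValue();       // above largest key: clamp
        if (lo.getKey().equals(hi.getKey())) {
            return lo.getValue();                   // exact key hit
        }
        // Linear interpolation between the two neighboring entries.
        double t = (key - lo.getKey()) / (hi.getKey() - lo.getKey());
        return lo.getValue() + t * (hi.getValue() - lo.getValue());
    }
}
```

A table with entries (0, 0) and (10, 100), for example, returns 50 for a lookup at 5 and clamps to the boundary values outside the key range.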
Fields

- edu.wpi.first.networktables.NetworkTableEntry.kPersistent: Use isPersistent() instead.
- org.opencv.core.CvType.CV_USRTYPE1: Please use CvType.CV_16F instead.
Methods

- edu.wpi.first.math.Matrix.mat(Nat<R>, Nat<C>)
- edu.wpi.first.math.trajectory.TrapezoidProfile.calculate(double): Pass the desired and current state into calculate() instead of constructing a new TrapezoidProfile with the desired and current state.
- edu.wpi.first.networktables.NetworkTableEntry.clearFlags(int): Use setPersistent() or topic properties instead.
- edu.wpi.first.networktables.NetworkTableEntry.delete(): Use unpublish() instead.
- edu.wpi.first.networktables.NetworkTableEntry.getFlags(): Use isPersistent() or topic properties instead.
- edu.wpi.first.networktables.NetworkTableEntry.setFlags(int): Use setPersistent() or topic properties instead.
- edu.wpi.first.wpilibj.Encoder.getPeriod(): Use getRate() instead.
- edu.wpi.first.wpilibj.Encoder.setMaxPeriod(double): Use setMinRate() instead. This method takes unscaled periods, while setMinRate() scales using the value from setDistancePerPulse().
- edu.wpi.first.wpilibj.Notifier.setHandler(Runnable): Use setCallback() instead.
- org.opencv.aruco.Aruco.detectCharucoDiamond(Mat, List<Mat>, Mat, float, List<Mat>, Mat, Mat, Mat, Dictionary): Use CharucoDetector::detectDiamonds instead.
- org.opencv.aruco.Aruco.detectMarkers(Mat, Dictionary, List<Mat>, Mat, DetectorParameters, List<Mat>): Use ArucoDetector::detectMarkers instead.
- org.opencv.aruco.Aruco.drawPlanarBoard(Board, Size, Mat, int, int): Use Board::generateImage instead.
- org.opencv.aruco.Aruco.estimatePoseBoard(List<Mat>, Mat, Board, Mat, Mat, Mat, Mat, boolean): Use cv::solvePnP instead.
- org.opencv.aruco.Aruco.estimatePoseSingleMarkers(List<Mat>, float, Mat, Mat, Mat, Mat, Mat, EstimateParameters): Use cv::solvePnP instead.
- org.opencv.aruco.Aruco.getBoardObjectAndImagePoints(Board, List<Mat>, Mat, Mat, Mat): Use Board::matchImagePoints instead.
- org.opencv.aruco.Aruco.interpolateCornersCharuco(List<Mat>, Mat, Mat, CharucoBoard, Mat, Mat, Mat, Mat, int): Use CharucoDetector::detectBoard instead.
- org.opencv.aruco.Aruco.refineDetectedMarkers(Mat, Board, List<Mat>, Mat, List<Mat>, Mat, Mat, float, float, boolean, Mat, DetectorParameters): Use ArucoDetector::refineDetectedMarkers instead.
- org.opencv.aruco.Aruco.testCharucoCornersCollinear(CharucoBoard, Mat): Use CharucoBoard::checkCharucoCornersCollinear instead.
- org.opencv.core.Core.getThreadNum(): The current implementation does not correspond to this documentation. The exact meaning of the return value depends on the threading framework used by the OpenCV library:
  - TBB: Unsupported with the current 4.1 TBB release. May be supported in a future release.
  - OpenMP: The thread number, within the current team, of the calling thread.
  - Concurrency: An ID for the virtual processor that the current context is executing on (0 for the master thread and a unique number for others, but not necessarily 1, 2, 3, ...).
  - GCD: The system calling thread's ID. Never returns 0 inside a parallel region.
  - C=: The index of the current parallel task. See also: setNumThreads, getNumThreads.
- org.opencv.dnn.Dnn.getInferenceEngineBackendType()
- org.opencv.dnn.Dnn.setInferenceEngineBackendType(String)
- org.opencv.dnn.Layer.run(List<Mat>, List<Mat>, List<Mat>): This method will be removed in a future release.
- org.opencv.dnn.Net.getLayer(String): Use int getLayerId(const String &layer) instead.
- org.opencv.imgproc.Imgproc.linearPolar(Mat, Mat, Point, double, int): This function produces the same result as cv::warpPolar(src, dst, src.size(), center, maxRadius, flags). It transforms the source image using the following transformation (see "Polar remaps reference image c)"):
  \(\begin{array}{l} dst(\rho, \phi) = src(x, y) \\ dst.size() \leftarrow src.size() \end{array}\)
  where
  \(\begin{array}{l} I = (dx, dy) = (x - center.x, y - center.y) \\ \rho = Kmag \cdot \texttt{magnitude}(I), \\ \phi = Kangle \cdot \texttt{angle}(I) \end{array}\)
  and
  \(\begin{array}{l} Kmag = src.cols / maxRadius \\ Kangle = src.rows / 2\pi \end{array}\)
- org.opencv.imgproc.Imgproc.logPolar(Mat, Mat, Point, double, int): This function produces the same result as cv::warpPolar(src, dst, src.size(), center, maxRadius, flags + WARP_POLAR_LOG). It transforms the source image using the following transformation (see "Polar remaps reference image d)"):
  \(\begin{array}{l} dst(\rho, \phi) = src(x, y) \\ dst.size() \leftarrow src.size() \end{array}\)
  where
  \(\begin{array}{l} I = (dx, dy) = (x - center.x, y - center.y) \\ \rho = M \cdot \log_e(\texttt{magnitude}(I)), \\ \phi = Kangle \cdot \texttt{angle}(I) \end{array}\)
  and
  \(\begin{array}{l} M = src.cols / \log_e(maxRadius) \\ Kangle = src.rows / 2\pi \end{array}\)
  The function emulates the human "foveal" vision and can be used for fast scale- and rotation-invariant template matching, for object tracking, and so forth.
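The linearPolar formulas above can be checked numerically without OpenCV. The sketch below evaluates the mapping for one sample pixel; the image size, center, and maxRadius are made-up values for illustration, and Math.atan2 is used for angle(I) (it returns values in [-pi, pi], whereas OpenCV measures angles in [0, 2*pi)).

```java
// Evaluate the linearPolar mapping rho = Kmag * magnitude(I),
// phi = Kangle * angle(I) for one sample pixel, with assumed values:
// a 256x256 source image, center (128, 128), and maxRadius = 128.
public class LinearPolarDemo {
    public static double[] map(double x, double y) {
        double cols = 256.0, rows = 256.0, maxRadius = 128.0;
        double cx = 128.0, cy = 128.0;
        double dx = x - cx, dy = y - cy;          // I = (dx, dy)
        double kMag = cols / maxRadius;           // Kmag = src.cols / maxRadius
        double kAngle = rows / (2.0 * Math.PI);   // Kangle = src.rows / 2*pi
        double rho = kMag * Math.hypot(dx, dy);   // rho = Kmag * magnitude(I)
        double phi = kAngle * Math.atan2(dy, dx); // phi = Kangle * angle(I)
        return new double[] { rho, phi };
    }
}
```

A pixel 64 columns to the right of the center, for instance, maps to rho = 2.0 * 64 = 128 and phi = 0.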
Constructors

- edu.wpi.first.math.MatBuilder(Nat<R>, Nat<C>)
- edu.wpi.first.math.trajectory.TrapezoidProfile(TrapezoidProfile.Constraints, TrapezoidProfile.State, TrapezoidProfile.State): Pass the desired and current state into calculate() instead of constructing a new TrapezoidProfile with the desired and current state.
- edu.wpi.first.wpilibj2.command.TrapezoidProfileCommand(TrapezoidProfile, Consumer<TrapezoidProfile.State>, Subsystem...): The new constructor allows you to pass in a supplier for the desired and current state, which lets you change goals at runtime.
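For context on the TrapezoidProfile entries above, here is a minimal, self-contained sketch of the trapezoidal-motion math the class implements, assuming a rest-to-rest move long enough to reach cruise velocity. The class and method names are illustrative, not the WPILib API.

```java
// Position along a trapezoidal motion profile at time t, assuming the
// move starts and ends at rest and the distance is long enough to reach
// maxVelocity (so the velocity curve is a true trapezoid).
public class TrapezoidSketch {
    public static double position(double t, double distance,
                                  double maxVelocity, double maxAcceleration) {
        double tAccel = maxVelocity / maxAcceleration;            // ramp-up time
        double dAccel = 0.5 * maxAcceleration * tAccel * tAccel;  // ramp-up distance
        double dCruise = distance - 2.0 * dAccel;                 // flat-top distance
        double tCruise = dCruise / maxVelocity;
        double tTotal = 2.0 * tAccel + tCruise;
        if (t <= 0.0) return 0.0;
        if (t >= tTotal) return distance;
        if (t < tAccel) {                              // accelerating phase
            return 0.5 * maxAcceleration * t * t;
        } else if (t < tAccel + tCruise) {             // cruising phase
            return dAccel + maxVelocity * (t - tAccel);
        } else {                                       // decelerating phase
            double td = tTotal - t;                    // time remaining
            return distance - 0.5 * maxAcceleration * td * td;
        }
    }
}
```

With a 6 m move at 2 m/s maximum velocity and 1 m/s^2 maximum acceleration, the profile accelerates for 2 s (covering 2 m), cruises for 1 s, then decelerates for 2 s, finishing at t = 5 s.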