How do I investigate a random Android native function call error?

First of all, I'm sorry about the title of the question; I'm facing a problem and I can't find anything about the error I'm getting in the logs.

I'm developing an Android application that uses OpenCV for image processing and matching.

The main OpenCV code is written in C++ and exposed to Java through JNI functions.

The application flow goes like this: first I capture an image with the camera, then I open the OpenCV camera and start matching every frame against the reference image.

On every frame I call a native method named detectFeatures, which returns a double representing the percentage by which the two images match.

Sometimes the application works, and sometimes it crashes with this error in the log:

/data/app/com.grimg.coffretpfe-2/lib/arm/libnative-lib.so (_Z6toGrayN2cv3MatES0_+1577)
/data/app/com.grimg.coffretpfe-2/lib/arm/libnative-lib.so (Java_com_grimg_coffretpfe_Activities_CompareActivity_detectFeatures+100)
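
A side note on reading that backtrace: the symbol in the first frame, _Z6toGrayN2cv3MatES0_, demangles to toGray(cv::Mat, cv::Mat), so the crash happens somewhere inside that function (at the given byte offset), and the second frame is the JNI wrapper that called it. You can demangle such names with c++filt, or programmatically with the C++ ABI demangler; a minimal stand-alone sketch (not part of the app):

#include <cxxabi.h>
#include <cstdio>
#include <cstdlib>

int main() {
    // Symbol copied from the crash-log frame
    const char *mangled = "_Z6toGrayN2cv3MatES0_";
    int status = 0;
    // __cxa_demangle allocates the result with malloc, so it must be freed
    char *demangled = abi::__cxa_demangle(mangled, nullptr, nullptr, &status);
    if (status == 0 && demangled != nullptr) {
        std::printf("%s\n", demangled);   // prints: toGray(cv::Mat, cv::Mat)
        std::free(demangled);
    } else {
        std::printf("demangling failed, status = %d\n", status);
    }
    return 0;
}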

In the C++ code I have a function named toGray; this is its signature:

double toGray(Mat captured, Mat target)

The JNI method I call from Java is:

extern "C"
jdouble
JNICALL Java_com_grimg_coffretpfe_Activities_CompareActivity_detectFeatures(
    JNIEnv *env,
    jclass type, jlong addrRgba, jlong addrGray /* this */) {

Mat &mRgb = *(Mat *) addrRgba;
Mat &mGray = *(Mat *) addrGray;

jdouble retVal;

double conv = toGray(mRgb, mGray);


retVal = (jdouble) conv;

return retVal;

}
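
Not a fix by itself, but since the crash only happens sometimes, it can help to harden this entry point while investigating: check that the addresses passed from Java are non-zero and that the Mats are not empty before calling toGray, and log whatever gets rejected. A minimal sketch along those lines (the log tag is arbitrary and the early-return value of 0.0 is just a placeholder):

#include <jni.h>
#include <android/log.h>
#include <opencv2/core.hpp>

using cv::Mat;

double toGray(Mat captured, Mat target);   // existing native function

extern "C"
JNIEXPORT jdouble JNICALL
Java_com_grimg_coffretpfe_Activities_CompareActivity_detectFeatures(
        JNIEnv *env, jclass type, jlong addrRgba, jlong addrGray) {

    // A null or stale address from the Java side would crash on dereference
    if (addrRgba == 0 || addrGray == 0) {
        __android_log_print(ANDROID_LOG_ERROR, "detectFeatures", "null Mat address");
        return 0.0;
    }

    Mat &mRgb = *(Mat *) addrRgba;
    Mat &mGray = *(Mat *) addrGray;

    // An empty frame (e.g. the very first camera callback) would also blow up downstream
    if (mRgb.empty() || mGray.empty()) {
        __android_log_print(ANDROID_LOG_WARN, "detectFeatures", "empty Mat, skipping frame");
        return 0.0;
    }

    return (jdouble) toGray(mRgb, mGray);
}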

I have searched a lot for this error and couldn't find any information about it.

Maybe you can help me figure this out.

Edit:

double toGray(Mat captured, Mat target) {
std::vector<cv::KeyPoint> keypointsCaptured;
std::vector<cv::KeyPoint> keypointsTarget;

cv::Mat descriptorsCaptured;
cv::Mat descriptorsTarget;
//cv::Mat captured;
std::vector<cv::DMatch> matches;
std::vector<cv::DMatch> symMatches;

//std::vector<std::vector<cv::DMatch> > matches1;
//std::vector<std::vector<cv::DMatch> > matches2;
//Mat captured, target;
/*captured = imread("/storage/emulated/0/data/img1.jpg", IMREAD_GRAYSCALE);
target = imread("/storage/emulated/0/data/img3.jpg", IMREAD_GRAYSCALE);
if (!captured.data) {
    // Print error message and quit
    __android_log_print(ANDROID_LOG_INFO, "sometag", "I cant do this.");

}
if (!target.data) {
    // Print error message and quit
    __android_log_print(ANDROID_LOG_INFO, "sometag", "cant do nuthin.");


}*/
//cvtColor(capturedR, captured, CV_RGBA2GRAY);
//cvtColor(targetR, target, CV_RGBA2GRAY);


orb = ORB::create();

//Pre-process
resize(captured, captured, Size(480, 360));
medianBlur(captured, captured, 5);

resize(target, target, Size(480, 360));
medianBlur(target, target, 5);

orb->detectAndCompute(captured, noArray(), keypointsCaptured, descriptorsCaptured);
orb->detectAndCompute(target, noArray(), keypointsTarget, descriptorsTarget);
//__android_log_print(ANDROID_LOG_INFO, "sometag", "keypoints2 size = %d", keypointsTarget.size());
//__android_log_print(ANDROID_LOG_INFO, "sometag", "keypoints size = %d", keypointsCaptured.size());

//Match images based on k nearest neighbour
std::vector<std::vector<cv::DMatch> > matches1;
matcher.knnMatch(descriptorsCaptured, descriptorsTarget,
                 matches1, 2);
//__android_log_print(ANDROID_LOG_INFO, "sometag", "Matches1 = %d",     matches1.size());
std::vector<std::vector<cv::DMatch> > matches2;
matcher.knnMatch(descriptorsTarget, descriptorsCaptured,
                 matches2, 2);
//Ratio filter
ratioTest(matches1);
ratioTest(matches2);
symmetryTest(matches1, matches2, symMatches);
ransacTest(symMatches,
           keypointsCaptured, keypointsTarget, matches);
const int symMatchCount = matches.size();

Point2f point1;
Point2f point2;
float median;
float meanBoy = 0;
float greatest = 0;
float lowest = 0;
int count = 0;
vector<float> angleList;
vector<Point2f> point1List;
vector<Point2f> point2List;

for (int i = 0; i < matches.size(); i++) {
    point1 = keypointsCaptured[matches[i].queryIdx].pt;
    point2 = keypointsTarget[matches[i].trainIdx].pt;
    point1List.push_back(point1);
    point2List.push_back(point2);

    deltaY = ((360 - point2.y) - (360 - point1.y));
    deltaX = (point2.x + 480 - point1.x);

    angle = atan2(deltaY, deltaX) * 180 / PI;
    cout << "ORB Matching Results" << angle << endl;
    //if (angle > greatest) greatest = angle;
    //if (angle < lowest) lowest = angle;
    meanBoy += angle;

    angleList.push_back(angle);
    //std::cout << "points " << "(" << point1.x << "," <<360-point1.y<<") (" << point2.x << ","<<360-point2.y<<") angle:" <<angle << std::endl;
    //std::cout << angle << std::endl;

}
// do something with the best points...

//std::cout << "Mean" << meanBoy/symMatchCount << std::endl;
vector<float> angleLCopy(angleList);
std::sort(angleLCopy.begin(), angleLCopy.end());
/*               if(angleList.size() % 2 == 0)
                         median = (angleList[angleList.size()/2 - 1] + angleList[angleList.size()/2]) / 2;
                 else
                         median = angleList[angleList.size()/2];
                */
size_t medianIndex = angleLCopy.size() / 2;
nth_element(angleLCopy.begin(), angleLCopy.begin() + medianIndex, angleLCopy.end());
median = angleLCopy[medianIndex];
std::cout << "new Median method " << angleLCopy[medianIndex] << std::endl;
//std::cout << "greatest " << greatest << "|| lowest "<< lowest << std::endl;

//std::cout << "No of matches by shehel: " << angleList[35] << " size " << symMatchCount << std::endl;
//std::cout << "Median" << median << std::endl;
//std::cout << matches.size()<< std::endl;
count = 0;
for (auto i = matches.begin(); i != matches.end();) {

    //std::cout << angleList.at(count)<< std::endl;

    //if (angle > greatest) greatest = angle;
    //if (angle < lowest) lowest = angle;
    point1 = point1List.at(count);
    point2 = point2List.at(count);

    deltaY = ((360 - point2.y) - (360 - point1.y));
    deltaX = ((point2.x + 480) - point1.x);

    angle = atan2(deltaY, deltaX) * 180 / PI;
    //angleList.push_back (angle);
    cout << "Is it sorted? " << angleList.at(count) << endl;

    if (angleList.at(count) > (median + 5) | angleList.at(count) < (median - 5)) {
        //cout << "bitch is gone" << angleList.at(count) << endl;
        matches.erase(i);
        count++;

    }
        //{i++; count++;}
    else {
        cout << "Points A (" << point1.x << ", " << point1.y << ") B (" <<
             point2.x + 480 << ", " << point2.y << ") Deltas of X" << deltaX << " Y " <<
             deltaY << "  Angle " << angle << endl;
        cout << "aint going no where" << angleList.at(count) << endl;

        ++i;
        count++;
        //if (angle>0.5 | angle < -0.7)
        //matches.erase(matches.begin()+i);
        // do something with the best points...
    }
}

return (static_cast<double>(matches.size()) / static_cast<double>(matches1.size()));
}

Edit 2:

cv::Mat ransacTest(
  const std::vector<cv::DMatch>& matches,
  const std::vector<cv::KeyPoint>& keypoints1,
  const std::vector<cv::KeyPoint>& keypoints2,
  std::vector<cv::DMatch>& outMatches) {
  // Convert keypoints into Point2f
  std::vector<cv::Point2f> points1, points2;
  cv::Mat fundemental;
  for (std::vector<cv::DMatch>::
     const_iterator it= matches.begin();
   it!= matches.end(); ++it) {
   // Get the position of left keypoints
   float x= keypoints1[it->queryIdx].pt.x;
   float y= keypoints1[it->queryIdx].pt.y;
   points1.push_back(cv::Point2f(x,y));
   // Get the position of right keypoints
   x= keypoints2[it->trainIdx].pt.x;
   y= keypoints2[it->trainIdx].pt.y;
   points2.push_back(cv::Point2f(x,y));
}
 // Compute F matrix using RANSAC
  std::vector<uchar> inliers(points1.size(),0);
  if (points1.size()>0&&points2.size()>0){
     cv::Mat fundemental= cv::findFundamentalMat(
     cv::Mat(points1),cv::Mat(points2), // matching points
      inliers,       // match status (inlier or outlier)
      CV_FM_RANSAC, // RANSAC method
      distance,      // distance to epipolar line
      confidence); // confidence probability
  // extract the surviving (inliers) matches
  std::vector<uchar>::const_iterator
                     itIn= inliers.begin();
  std::vector<cv::DMatch>::const_iterator
                     itM= matches.begin();
  // for all matches
  for ( ;itIn!= inliers.end(); ++itIn, ++itM) {
     if (*itIn) { // it is a valid match
         outMatches.push_back(*itM);
      }
   }
   if (refineF) {
   // The F matrix will be recomputed with
   // all accepted matches
      // Convert keypoints into Point2f
      // for final F computation
      points1.clear();
      points2.clear();
      for (std::vector<cv::DMatch>::
             const_iterator it= outMatches.begin();
          it!= outMatches.end(); ++it) {
          // Get the position of left keypoints
          float x= keypoints1[it->queryIdx].pt.x;
          float y= keypoints1[it->queryIdx].pt.y;
          points1.push_back(cv::Point2f(x,y));
          // Get the position of right keypoints
          x= keypoints2[it->trainIdx].pt.x;
          y= keypoints2[it->trainIdx].pt.y;
          points2.push_back(cv::Point2f(x,y));
      }
      // Compute 8-point F from all accepted matches
      if (points1.size()>0&&points2.size()>0){
         fundemental= cv::findFundamentalMat(
            cv::Mat(points1),cv::Mat(points2), // matches
            CV_FM_8POINT); // 8-point method
      }
   }
}
return fundemental;

}

Answer

I can see a possible crash where you loop over the matches in your toGray function:

matches.erase(i);

This invalidates the iterator i. You should replace it with:

i = matches.erase(i);
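
With that change, the filtering loop from the question would look roughly like this (a sketch based on the posted code; the bitwise | is also written as the logical || that was presumably intended):

for (auto i = matches.begin(); i != matches.end();) {
    if (angleList.at(count) > (median + 5) || angleList.at(count) < (median - 5)) {
        // erase() invalidates i but returns the next valid iterator
        i = matches.erase(i);
        count++;
    } else {
        ++i;
        count++;
    }
}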
