ArcSoft Face Recognition Application Development Process

Keywords: network, Gradle

Sharing some solid, practical content with you tonight.

ArcSoft Face Recognition supports both online and offline application development. Because it does not require a network connection, recognition is faster. Okay, enough chatter; let me show you how to use it.

1. First, go to the official website to apply for an APP_ID and the various keys, then download the jar package; I won't walk through that step by step. Note that the following snippet needs to be added to the app module's build.gradle, otherwise the .so libraries may fail to load.

sourceSets {
    main {
        jniLibs.srcDirs = ['libs']
    }
}
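For reference, here is a minimal sketch of how this fits into the app module's build.gradle; the surrounding configuration and the exact jar names depend on your project and the SDK package you downloaded:

android {
    // compileSdkVersion, defaultConfig, buildTypes, etc. as in a normal project
    sourceSets {
        main {
            // tell Gradle to pick up the ArcSoft .so files from app/libs
            jniLibs.srcDirs = ['libs']
        }
    }
}

dependencies {
    // the ArcSoft jar packages copied into app/libs
    implementation fileTree(dir: 'libs', include: ['*.jar'])
}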

2. Next comes the development itself. For example, face detection requires the engine to be initialized first.

AFD_FSDKEngine engine1 = new AFD_FSDKEngine();
AFD_FSDKError err = engine1.AFD_FSDK_InitialFaceEngine(Config.APP_ID, Config.FD_KEY, AFD_FSDKEngine.AFD_OPF_0_HIGHER_EXT, 16, 5);
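It is worth checking the returned error code before going any further; a minimal sketch (0 is the success code):

if (err.getCode() != 0) {
    // initialization failed; the engine cannot be used
    Log.e("TAG", "AFD_FSDK_InitialFaceEngine failed, error code: " + err.getCode());
}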

We also need a collection to store our detected faces.

List<AFD_FSDKFace> result = new ArrayList<AFD_FSDKFace>(); // Create a new AFD_FSDKFace collection to store the detected face information

Next we can perform face detection, but the photo has to be selected and converted to the required format first.

Bitmap bitmap1 = decodeImage(path1); // path1 is the photo's file path; select the photo and decode it into a Bitmap
byte[] data1 = getNv21(bitmap1);     // convert the Bitmap to NV21 format

The following is the code for the tool classes decodeImage and getNv21:

// getNv21 and decodeImage are photo format conversion helpers
public byte[] getNv21(Bitmap mBitmap) {
    // NV21 uses 12 bits per pixel: width * height * 3 / 2 bytes
    byte[] data = new byte[mBitmap.getWidth() * mBitmap.getHeight() * 3 / 2];
    ImageConverter convert = new ImageConverter();
    convert.initial(mBitmap.getWidth(), mBitmap.getHeight(), ImageConverter.CP_PAF_NV21);
    if (convert.convert(mBitmap, data)) {
        Log.e("TAG", "convert ok!");
    }
    convert.destroy();
    return data;
}

public static Bitmap decodeImage(String path) {
    Bitmap res;
    try {
        // read the EXIF orientation so the photo can be rotated upright
        ExifInterface exif = new ExifInterface(path);
        int orientation = exif.getAttributeInt(ExifInterface.TAG_ORIENTATION, ExifInterface.ORIENTATION_NORMAL);

        BitmapFactory.Options op = new BitmapFactory.Options();
        op.inSampleSize = 1;
        op.inJustDecodeBounds = false;
        //op.inMutable = true;
        res = BitmapFactory.decodeFile(path, op);

        // rotate according to the EXIF orientation
        Matrix matrix = new Matrix();
        if (orientation == ExifInterface.ORIENTATION_ROTATE_90) {
            matrix.postRotate(90);
        } else if (orientation == ExifInterface.ORIENTATION_ROTATE_180) {
            matrix.postRotate(180);
        } else if (orientation == ExifInterface.ORIENTATION_ROTATE_270) {
            matrix.postRotate(270);
        }

        Bitmap temp = Bitmap.createBitmap(res, 0, 0, res.getWidth(), res.getHeight(), matrix, true);
        Log.d("com.arcsoft", "check target Image:" + temp.getWidth() + "X" + temp.getHeight());
        if (!temp.equals(res)) {
            res.recycle();
        }
        return temp;
    } catch (Exception e) {
        e.printStackTrace();
    }
    return null;
}

After the format conversion is complete, face detection starts.

err = engine1.AFD_FSDK_StillImageFaceDetection(data1, bitmap1.getWidth(), bitmap1.getHeight(), AFD_FSDKEngine.CP_PAF_NV21, result);
Log.e("TAG", "getBit: " + result.size());
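For example, the detected regions can be inspected like this (a minimal sketch reusing the names from the snippets above):

if (err.getCode() == 0 && !result.isEmpty()) {
    // log the bounding rectangle and orientation of each detected face
    for (AFD_FSDKFace face : result) {
        Log.d("TAG", "face rect: " + face.getRect() + ", degree: " + face.getDegree());
    }
} else {
    Log.e("TAG", "no face detected, error code: " + err.getCode());
}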

As shown above, the size of the result list tells us whether a face was detected. At the end of the code, the initialized engine must be destroyed, otherwise the program will crash due to memory problems:

engine1.AFD_FSDK_UninitialFaceEngine();

Face comparison builds on face detection: the faces in a photo are detected first, and their features are then extracted and compared.
As described above, the detected face information is stored in the result collection (a List<AFD_FSDKFace> as before). Then two AFR_FSDKFace objects are created to hold the extracted facial features:

// Create two AFR_FSDKFace objects to store the facial feature information
AFR_FSDKFace face1 = new AFR_FSDKFace();
AFR_FSDKFace face2 = new AFR_FSDKFace();

// Extract the face feature information from each photo
er = engine_camera.AFR_FSDK_ExtractFRFeature(data_image,
        bitmap_idcard.getWidth(),
        bitmap_idcard.getHeight(),
        AFR_FSDKEngine.CP_PAF_NV21,
        new Rect(result_image.get(0).getRect()),
        result_image.get(0).getDegree(),
        face1);
er = engine_camera.AFR_FSDK_ExtractFRFeature(data,
        wid,
        hei,
        AFR_FSDKEngine.CP_PAF_NV21,
        new Rect(result_fd.get(0).getRect()),
        result_fd.get(0).getDegree(),
        face2);

The similarity from the final comparison is stored in a score object and read with float score_face = score.getScore();
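For completeness, here is a minimal sketch of the comparison step. It assumes the recognition engine (the engine_camera used above) is created, initialized, and released with AFR_FSDK_InitialEngine / AFR_FSDK_UninitialEngine, and that Config.FR_KEY is the recognition key applied for on the official website:

// initialize the face recognition (FR) engine
AFR_FSDKEngine engine_camera = new AFR_FSDKEngine();
AFR_FSDKError er = engine_camera.AFR_FSDK_InitialEngine(Config.APP_ID, Config.FR_KEY);

// ... extract face1 and face2 with AFR_FSDK_ExtractFRFeature as shown above ...

// compare the two feature sets; the similarity is written into score
AFR_FSDKMatching score = new AFR_FSDKMatching();
er = engine_camera.AFR_FSDK_FacePairMatching(face1, face2, score);
float score_face = score.getScore();   // similarity returned as a float
Log.e("TAG", "similarity: " + score_face);

// release the FR engine when finished
engine_camera.AFR_FSDK_UninitialEngine();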

In this way we get the similarity information we want; the final value is of type float.

*Attention! When using photos, it is best for the resolution to be even (both width and height), otherwise an unknown error may occur. The time taken by face feature extraction depends on the CPU processing power of the device.
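If a photo has an odd width or height, it can be cropped before conversion; a hypothetical helper (not part of the SDK) might look like this:

// Hypothetical helper: crop a bitmap so that both width and height are even
public static Bitmap cropToEvenSize(Bitmap src) {
    int w = src.getWidth() & ~1;    // clear the lowest bit to make the width even
    int h = src.getHeight() & ~1;   // clear the lowest bit to make the height even
    if (w == src.getWidth() && h == src.getHeight()) {
        return src;                 // already even, nothing to do
    }
    return Bitmap.createBitmap(src, 0, 0, w, h);
}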

Posted by e39m5 on Wed, 15 May 2019 18:13:04 -0700