Fingerprint images captured by optical readers are difficult to align because rotation and translation of the finger occur together. Since live applications demand both real-time operation and high accuracy, this paper presents a novel approach to fingerprint recognition based on the core sub-region, the 100 × 100 pixel area surrounding the core point. Log-polar mapping is used to extract translation-invariant features derived from the discrete wavelet frame transform. Finally, a fitness function based on the Bayesian likelihood ratio is devised to genetically select the most discriminative log-polar feature subset, discarding redundant features, with classification performed by support vector machines. Classification results are reported for real fingerprint data. Experimental results show that the proposed method rejects impostors efficiently and achieves a recognition rate of over 98% while operating at two frames per second. Compared with related work, the proposed system is more accurate than conventional minutiae-based methods.
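A minimal sketch of the log-polar mapping step is given below, assuming a 100 × 100 core sub-region already cropped around the core point; the output grid size, the nearest-neighbour resampling, and all parameter values are illustrative assumptions rather than the authors' implementation. The mapping resamples the crop so that a rotation of the finger becomes a circular shift along the angular axis, which is what makes the subsequent wavelet-frame features insensitive to in-plane rotation and small misalignments.

```python
import numpy as np

def log_polar_map(sub_region, out_shape=(64, 64)):
    """Resample a square core sub-region onto a log-polar grid.

    Rows of the output correspond to log-radius and columns to angle,
    so an in-plane rotation of the input becomes a circular shift
    along the column axis.
    """
    h, w = sub_region.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0      # centre of the crop (core point)
    n_rho, n_theta = out_shape
    max_radius = min(cx, cy)

    # Log-spaced radii and uniformly spaced angles.
    rho = np.exp(np.linspace(0.0, np.log(max_radius), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    r_grid, t_grid = np.meshgrid(rho, theta, indexing="ij")

    # Cartesian sample coordinates, nearest-neighbour lookup.
    ys = np.clip(np.round(cy + r_grid * np.sin(t_grid)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + r_grid * np.cos(t_grid)).astype(int), 0, w - 1)
    return sub_region[ys, xs]

# Example: map a hypothetical 100 x 100 core sub-region (grey values in [0, 1]).
crop = np.random.rand(100, 100)
lp = log_polar_map(crop)
print(lp.shape)  # (64, 64): rows = log-radius, columns = angle
```

In the full pipeline described above, the log-polar image would then be decomposed with the discrete wavelet frame transform, and the resulting feature vector pruned by the genetic selection stage before SVM classification.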