In this paper we present a novel framework for evolving ART-based classification models, which we refer to as MOME-ART. The framework evolves populations of ART classifiers to simultaneously optimize their classification error and their structural complexity. To this end, it combines interacting sub-populations, traditional genetic-algorithm operators to evolve these populations, and a simulated annealing process for solution refinement, giving rise to a multi-objective, memetic evolutionary framework. To demonstrate its capabilities, we use the new framework to train populations of semi-supervised Fuzzy ARTMAP classifiers and compare them with similar networks trained via the recently published MO-GART framework, which has been shown to be very effective in yielding high-quality ART-based classifiers. The experimental results show clear advantages of MOME-ART in terms of Pareto front quality and density, as well as the parsimony of the resulting classifiers.
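To make the general recipe concrete, the following is a minimal, self-contained sketch of a multi-objective memetic loop in the spirit described above: GA-style variation evolves a population, simulated annealing refines selected offspring, and Pareto dominance over two objectives drives selection. This is an illustrative stand-in, not MOME-ART itself: the quadratic objectives, the operators, and all parameter values below are hypothetical, whereas the actual framework evolves ART network structures and uses classification error and structural complexity as its objectives, together with interacting sub-populations.

```python
import math
import random

def objectives(x):
    """Two conflicting objectives to minimize (hypothetical stand-ins
    for classification error and structural complexity)."""
    f1 = sum(v * v for v in x)
    f2 = sum((v - 2.0) ** 2 for v in x)
    return (f1, f2)

def dominates(a, b):
    """Pareto dominance: a is no worse everywhere and strictly better once."""
    return all(ai <= bi for ai, bi in zip(a, b)) and \
           any(ai < bi for ai, bi in zip(a, b))

def anneal(x, steps=30, temp=1.0, cooling=0.9):
    """Simulated-annealing refinement of one solution, using a simple
    weighted-sum scalarization of the two objectives."""
    score = lambda s: sum(objectives(s))
    cur, cur_s = list(x), score(x)
    for _ in range(steps):
        cand = [v + random.gauss(0.0, 0.2) for v in cur]
        cand_s = score(cand)
        # Accept improvements always; accept worsening moves with a
        # temperature-dependent probability.
        if cand_s < cur_s or random.random() < math.exp(-(cand_s - cur_s) / max(temp, 1e-9)):
            cur, cur_s = cand, cand_s
        temp *= cooling
    return cur

def memetic_mo(pop_size=20, dim=2, gens=15, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-3.0, 5.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        # GA-style variation: arithmetic crossover plus Gaussian mutation.
        offspring = []
        for _ in range(pop_size):
            p1, p2 = random.sample(pop, 2)
            offspring.append([(a + b) / 2.0 + random.gauss(0.0, 0.1)
                              for a, b in zip(p1, p2)])
        # Memetic step: locally refine a subset of offspring.
        for i in range(0, pop_size, 5):
            offspring[i] = anneal(offspring[i])
        # Environmental selection: keep non-dominated solutions first,
        # then fill remaining slots by weighted-sum score.
        scored = [(objectives(s), s) for s in pop + offspring]
        front = [s for fs, s in scored
                 if not any(dominates(gs, fs) for gs, _ in scored)]
        if len(front) > pop_size:
            front = front[:pop_size]
        elif len(front) < pop_size:
            rest = sorted((s for _, s in scored if s not in front),
                          key=lambda s: sum(objectives(s)))
            front += rest[:pop_size - len(front)]
        pop = front
    # Report the non-dominated objective vectors of the final population.
    fronts = [objectives(s) for s in pop]
    return [f for f in fronts if not any(dominates(g, f) for g in fronts)]
```

Running `memetic_mo()` returns an approximation of the Pareto front as a list of mutually non-dominated objective pairs, the same kind of output whose quality and density the experiments above compare between MOME-ART and MO-GART.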