In recent years, machine learning has achieved great success in many fields. In real-world scenarios, however, complex collection environments or costly annotation often leave few labeled samples available for training, causing machine learning models to overfit or underfit. Few-shot learning is therefore a challenging machine learning problem. Recently, a distribution calibration method was proposed that assumes each dimension of the feature representation follows a Gaussian distribution and
uses the feature distribution of the base classes to calibrate the feature distribution of the novel classes. However, this method easily introduces negative transfer
and tends to overwhelm the original feature distribution of the novel classes. We therefore propose dynamic distribution calibration to address the negative transfer problem in distribution calibration. First, nearest-neighbor base classes and far-domain base classes are selected dynamically according to a threshold. Second,
the novel-class sample features are standardized after a power transformation to eliminate differences across dimensions. Finally, the method introduces a parameter to adjust the proportion between the transferred distribution and the original feature distribution of the novel classes, thereby calibrating the novel-class feature distribution. Extensive comparative experiments against recent and traditional algorithms on the standard miniImageNet and CUB datasets show that the proposed method can effectively improve the performance of few-shot classification tasks.
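The overall procedure can be illustrated with a short NumPy sketch. It is only an outline under stated assumptions: the threshold rule, the mixing parameter `lam`, and the identity placeholder used as the single support sample's covariance are illustrative choices of ours, not the paper's exact formulation.

```python
import numpy as np

def dynamic_calibrate(novel_feat, base_means, base_covs,
                      tau=2.0, lam=0.5, power=0.5):
    """Calibrate the feature distribution of one novel-class support sample.

    novel_feat : (d,)     raw (non-negative) feature of a novel support sample
    base_means : (B, d)   per-base-class feature means
    base_covs  : (B, d, d) per-base-class feature covariances
    tau        : distance threshold separating nearest-neighbor from
                 far-domain base classes (assumed selection rule)
    lam        : proportion between transferred and original statistics
    power      : exponent of the power (Tukey) transformation
    """
    # Power transformation to make the feature more Gaussian-like.
    x = np.power(novel_feat, power)
    # Standardization across dimensions to remove scale differences.
    x = (x - x.mean()) / (x.std() + 1e-8)

    # Dynamic selection: keep base classes whose mean lies within tau of the
    # transformed novel feature; fall back to the single closest class.
    dists = np.linalg.norm(base_means - x, axis=1)
    near = dists <= tau
    if not near.any():
        near = dists == dists.min()

    transferred_mean = base_means[near].mean(axis=0)
    transferred_cov = base_covs[near].mean(axis=0)

    # Mix transferred statistics with the novel sample's own statistics.
    # A single support sample gives no covariance estimate, so an identity
    # matrix stands in for it here (illustrative assumption).
    calib_mean = lam * transferred_mean + (1.0 - lam) * x
    calib_cov = lam * transferred_cov + (1.0 - lam) * np.eye(x.shape[0])
    return calib_mean, calib_cov

# Usage: draw extra features from the calibrated Gaussian to augment the
# novel-class support set before training a simple classifier.
rng = np.random.default_rng(0)
d, B = 64, 10
base_means = rng.normal(size=(B, d))
base_covs = np.stack([np.eye(d) for _ in range(B)])
support = np.abs(rng.normal(size=d))   # power transform expects >= 0 inputs
mean, cov = dynamic_calibrate(support, base_means, base_covs)
augmented = rng.multivariate_normal(mean, cov, size=100)
```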