How can I fix the dimension error between MFCC features and an AdaBoost classifier?

I want to extract features from 3450 audio files for training and 690 audio files for testing, and classify the data into 6 classes. Y_train (shape (3450, 6)) and Y_test (shape (690, 6)) are the one-hot labels for the training and test data. When I run the code below, each file's MFCC feature matrix comes out as (25, 13), and this causes a dimension error when building X_train. How can I fix this?
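To see why the shapes clash: mfcc() returns one row per analysis window, so files of different lengths produce matrices with different row counts, and a list of such matrices cannot be stacked into a single (n_files, n_features) array. Here is a minimal sketch with synthetic signals (the sample rate and durations are made up for illustration):

import numpy as np
from python_speech_features import mfcc

rate = 16000
short = np.random.randn(rate // 4)   # 0.25 s of synthetic audio
long_ = np.random.randn(rate // 2)   # 0.50 s of synthetic audio

a = mfcc(short, rate, winlen=0.06, winstep=0.01, numcep=13, nfft=1024)
b = mfcc(long_, rate, winlen=0.06, winstep=0.01, numcep=13, nfft=1024)
print(a.shape, b.shape)         # e.g. (20, 13) vs (45, 13): row counts differ

print(np.mean(a, axis=0).shape) # (13,): averaging over the time axis gives
                                # one fixed-length vector per file

The listing below applies this per-file averaging to get a fixed-length feature vector for every file.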

from pathlib import Path

import numpy as np
from keras.utils import np_utils
import scipy.io.wavfile as wavfile

from python_speech_features import mfcc
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score


# Load training data
path = Path('/home/narges/DataSet_981110/Vowel-DataSet/data-spilit/Data_train/').glob('**/*.wav')
wavs = [str(wavf) for wavf in path if wavf.is_file()]
wavs.sort()

number_of_files = len(wavs)
spk_ID = [wavs[i].split('/')[-1].lower() for i in range(number_of_files)]

label_no = [spk_ID[i].split('_')[-2] for i in range(number_of_files)]
Y_train = np_utils.to_categorical(label_no)

# Extract one fixed-length feature vector per file: mfcc() returns a
# (frames, numcep) matrix whose row count varies with file length, so
# average over the time axis to get a single 13-dim vector per file.
X_train = []
for i in range(number_of_files):
    (rate, sig) = wavfile.read(wavs[i])
    feat = mfcc(sig, rate, winlen=0.06, winstep=0.01, numcep=13, nfilt=26,
                nfft=512, lowfreq=0, highfreq=rate/2, preemph=0.97,
                ceplifter=22, appendEnergy=True, winfunc=np.hamming)
    X_train.append(np.mean(feat, axis=0))
X_train = np.array(X_train)   # shape: (3450, 13)

# Load test data
path_t = Path('/home/narges/DataSet_981110/Vowel-DataSet/data-spilit/Data_test/').glob('**/*.wav')
wavs_t = [str(wavf) for wavf in path_t if wavf.is_file()]
wavs_t.sort()

number_of_files_t = len(wavs_t)
spk_ID_t = [wavs_t[i].split('/')[-1].lower() for i in range(number_of_files_t)]

label_no_t = [spk_ID_t[i].split('_')[-2] for i in range(number_of_files_t)]
Y_test = np_utils.to_categorical(label_no_t)

# Extract the same per-file features for the test set, with the same
# MFCC parameters as for training so the feature spaces match.
X_test = []
for i in range(number_of_files_t):
    (rate_t, sig_t) = wavfile.read(wavs_t[i])
    feat_t = mfcc(sig_t, rate_t, winlen=0.06, winstep=0.01, numcep=13, nfilt=26,
                  nfft=512, lowfreq=0, highfreq=rate_t/2, preemph=0.97,
                  ceplifter=22, appendEnergy=True, winfunc=np.hamming)
    X_test.append(np.mean(feat_t, axis=0))
X_test = np.array(X_test)   # shape: (690, 13)


# Create the AdaBoost classifier. base_estimator must be an estimator
# instance (not the SVC class itself); probability=True lets the default
# SAMME.R algorithm use the SVC's class-probability estimates.
adamodel = AdaBoostClassifier(n_estimators=50,
                              base_estimator=SVC(probability=True),
                              learning_rate=1.0)

# Train the AdaBoost classifier. scikit-learn expects a feature matrix and
# 1-D integer class labels, so convert the one-hot labels back with argmax.
y_train = np.argmax(Y_train, axis=1)
model = adamodel.fit(X_train, y_train)

# Predict the response for the test dataset
y_pred = model.predict(X_test)


# Model accuracy: how often is the classifier correct?
y_test = np.argmax(Y_test, axis=1)
print("accuracy:", accuracy_score(y_test, y_pred))