https://sites.google.com/site/journalofcomputing/
www.journalofcomputing.org
Abstract— This paper describes a system for automating the butter churning process by applying digital signal processing
techniques to the sound of the churning process. Butter churning is a process that extracts butter from whole milk by
mechanical motion. To the best of our knowledge, we are the first to automate the butter churning process based on its
acoustic signature. The argument of this paper is that the sound of churning varies according to the phase of the process.
The churning process is divided in this paper into three phases: the first is the churning phase; the second is the
butter-begin phase, where the butter starts to come; and the last is the butter collection phase, where the butter grains
are gathered and the churning process ends. This paper characterizes the sound of each churning phase. A feature vector
is extracted from the sound of each phase based on its spectrum distribution. An artificial neural network is used as the
classifier. Results show that the sound of each phase can be used to characterize the phases of the churning process. The
shushing sound of the motion of the butter grains is recognized in this paper and can be utilized to automate the churning
process. This paper also describes the design and implementation of the system using a dsPIC digital signal controller.
Index Terms— Butter Churning, Acoustic Signal, Classification, Neural Network, dsPIC.
1 INTRODUCTION

ter churn, mainly less than 500 Hz. Sounds of the butter churning phases are recorded at a
sampling rate of 11025 Hz. The sound signal is then preprocessed as described in the
following sections.

2.1 Time Processing Stage
The DC bias is removed by subtracting the mean of the time-series sound from each sample:

$x_i[n] = x_i[n] - \frac{1}{N}\sum_{n=1}^{N} x_i[n]$    (1)

where N is the length of sound i.

2.2 Spectrum Analysis
The spectrum of each frame is obtained with the FFT:

$X_i(\omega) = \mathrm{FFT}\{x_i[n]\}$    (2)

After that, the spectrum magnitude is normalized for every frame as in Eq. (3), where K is
the window size:

$\bar{X}_i(\omega) = \frac{|X_i(\omega)|}{\sum_{k=1}^{K}|X_i(k)|}$    (3)

The median of all frames is considered the extracted feature vector:

$X_f(\omega) = \operatorname{median}_i\, \bar{X}_i(\omega)$    (4)

The first 32 points of the median of the spectrum magnitude contain frequencies up to
300 Hz. This gives a 32-dimensional vector that characterizes the sound of each churning
phase. The median is used in this paper because of the noisy environment. Fig. 1 displays
the acoustic signals of the three phases of the churning process, while Fig. 2 shows the
spectrograms of the sounds. The spectrograms also show the frequency peaks; it is quite
clear that most of the information lies in the low-frequency band. For an unknown
utterance, the same steps are performed, except that a single FFT frame is used as the
feature to be classified, to reduce the cost of computation, since this FFT computation is
performed on-line. This could be extended to multiple frames, but at a higher
computational cost.
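As a minimal, self-contained sketch of the feature extraction in Eqs. (1)-(4): the paper specifies only the 11025 Hz sampling rate and a 32-point feature covering roughly 0-300 Hz, so the 1024-sample frame length, the non-overlapping framing, and the synthetic test tone below are illustrative assumptions, not the authors' implementation.

```python
import cmath
import math
import statistics

FS = 11025  # sampling rate used in the paper

def fft(x):
    """Radix-2 FFT (len(x) must be a power of two)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        tw = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + tw
        out[k + n // 2] = even[k] - tw
    return out

def extract_feature(signal, frame_len=1024, n_features=32):
    # Eq. (1): remove the DC bias by subtracting the mean.
    mean = sum(signal) / len(signal)
    x = [s - mean for s in signal]

    # Eq. (2): magnitude spectrum of every (non-overlapping) frame.
    frames = [x[i:i + frame_len]
              for i in range(0, len(x) - frame_len + 1, frame_len)]
    spectra = [[abs(v) for v in fft(f)[:frame_len // 2]] for f in frames]

    # Eq. (3): normalize each frame's magnitudes by their sum
    # (here taken over the retained half-spectrum).
    normed = [[m / (sum(spec) or 1.0) for m in spec] for spec in spectra]

    # Eq. (4): bin-by-bin median across frames; the first 32 bins cover
    # roughly 0-334 Hz at fs = 11025 Hz with 1024-point frames.
    return [statistics.median(spec[k] for spec in normed)
            for k in range(n_features)]

# Toy input: a 150 Hz tone with a DC offset standing in for a churn sound.
tone = [0.5 + math.sin(2 * math.pi * 150 * t / FS) for t in range(4096)]
fv = extract_feature(tone)
peak_bin = fv.index(max(fv))  # expected near 150 / (11025/1024), i.e. bin ~14
```

The median across frames, rather than the mean, is what gives the feature its robustness to the impulsive noise of the churn, as the paper notes.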
3 ARTIFICIAL NEURAL NETWORK (ANN)
An artificial neural network is an intelligent learning tool used to solve problems that
are hard to model analytically. A key feature of neural networks is their ability to learn
from examples of input/output pairs in a supervised fashion. Artificial neural networks
have long been used as a classification technique.
In our case, there are three classes: class one is the churn sound, which is the sound of
the motion of the cream particles; the second class is the sound of the butter beginning
to come; and the third class is the sound of the butter collection, which is the shushing
sound. Multiple sounds are recorded for each class, and a 32-dimensional feature vector
is extracted from each of these sounds. These sets of feature vectors for all classes are
used to train the neural network.
An input of 400 samples of 32 features each is used to train, validate, and test the
neural network. The targets are three classes representing the three phases of the
churning process. The 400 samples are randomly divided into 70%, 15%, and 15% for
training, validation, and testing, respectively, so the network is trained with 280
samples, validated with 60 samples, and tested with 60 samples. A three-layer network
with 10 hidden neurons is trained using scaled conjugate gradient backpropagation. The
performance of the neural network is shown in Table 1. The overall correct
classification rate is shown in the last column of the table, which is 98.3%. This
indicates that the feature extraction method is efficient and can be used to track the
phases of the churning process.

Fig. 1. Time domain of the three phases' sounds.
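The training setup above can be sketched as follows. This is not the authors' Matlab code: the data here are synthetic stand-ins for the 400 recorded feature vectors, and plain batch gradient descent replaces the scaled conjugate gradient optimizer purely for brevity; the 32-10-3 topology and the 70/15/15 split do follow the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the 400 recorded 32-dimensional feature vectors
# (3 classes); real inputs would come from the Eq. (4) feature extraction.
n_feat, n_hidden, n_classes = 32, 10, 3
centers = rng.normal(size=(n_classes, n_feat))
X = np.vstack([c + 0.3 * rng.normal(size=(134, n_feat)) for c in centers])[:400]
y = np.repeat(np.arange(n_classes), 134)[:400]
Y = np.eye(n_classes)[y]

# 70% / 15% / 15% random split, as in the paper: 280 / 60 / 60 samples.
idx = rng.permutation(400)
tr, va, te = idx[:280], idx[280:340], idx[340:]

# One hidden layer of 10 tanh units and a softmax output layer.
W1 = 0.1 * rng.normal(size=(n_feat, n_hidden)); b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.normal(size=(n_hidden, n_classes)); b2 = np.zeros(n_classes)

def forward(A):
    H = np.tanh(A @ W1 + b1)
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    return H, P / P.sum(axis=1, keepdims=True)

# Cross-entropy loss minimized by batch gradient descent (a stand-in for
# the scaled conjugate gradient backpropagation used in the paper).
for _ in range(500):
    H, P = forward(X[tr])
    G = (P - Y[tr]) / len(tr)
    dH = (G @ W2.T) * (1 - H ** 2)
    W2 -= 0.5 * (H.T @ G);     b2 -= 0.5 * G.sum(axis=0)
    W1 -= 0.5 * (X[tr].T @ dH); b1 -= 0.5 * dH.sum(axis=0)

def accuracy(s):
    return (forward(X[s])[1].argmax(axis=1) == y[s]).mean()
```

On well-separated synthetic clusters such as these, the small network converges quickly; the real recordings would of course be harder.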
TABLE 1
CORRECT CLASSIFICATION RATE

                               Training    Validation    Test
Correct Classification Rate     99.50%       98.30%     93.30%

Fig. 4. System Layout.

4 SYSTEM FLOWCHART
A digital signal controller is used in this system, as shown in Fig. 4, to process the
sounds, extract the feature vector, and classify it into one of the three classes. Fig. 5
clarifies the principle of operation of the system. The system starts working when the
butter churning machine starts. First, it initializes and starts the audio codec driver
and the flash memory driver on the dsPIC Starter Kit, as in Fig. 4. After that, the
system starts recording the butter churning sound and then extracts a feature vector from
the sound using the FFT. The FFT computation is implemented by calling the
"FFTComplexIP()" function from the Microchip signal processing library. Then the ANN is
used to classify the recorded sound into one of the following classes:

1. Butter starts class.
2. Butter churns class.
3. Butter collection class.
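The flowchart's class-handling rules (keep recording on the butter-start class, set an LED on the butter-churn class, raise an alarm and stop churning on the butter-collection class) can be sketched as a small decision routine. The constants and the action names below are hypothetical placeholders for the dsPIC firmware's I/O hooks, not the actual firmware.

```python
# Hypothetical sketch of the flowchart's decision logic; the three class
# labels mirror the paper's classes, while "led_on"/"alarm"/"stop" stand
# in for the dsPIC LED, alarm, and motor-control outputs.
BUTTER_STARTS, BUTTER_CHURNS, BUTTER_COLLECTION = 0, 1, 2

def step(phase, actions):
    """Handle one classified sound frame; return False to stop churning."""
    if phase == BUTTER_STARTS:
        return True                   # keep recording and classifying
    if phase == BUTTER_CHURNS:
        actions.append("led_on")      # LED indicates the churn phase
        return True
    if phase == BUTTER_COLLECTION:
        actions.append("alarm")       # alarm, then stop the churn
        return False
    raise ValueError(f"unknown class {phase}")

def run(phases):
    """Process a sequence of classified frames until the churn stops."""
    actions = []
    for p in phases:
        if not step(p, actions):
            actions.append("stop")
            break
    return actions
```

In the real system this loop would be driven by the on-line record / FFT / classify cycle described above rather than by a precomputed list of phases.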
If the sound is classified into the butter-start class, the record-and-classify process
continues. If the sound is classified into the butter-churn class, recording also
continues and an LED is set. If the sound is classified into the butter-collection class,
an alarm sounds and the churning process is stopped.

5 RESULTS AND DISCUSSION
The results in this paper are based on real-life butter churning sounds recorded at the
Agricultural Research Station at Mu'tah University. A microphone was placed on an
electrical butter churn, and the whole process was recorded several times. Matlab is used
to analyze the sounds. After analysis, the whole process is divided into three phases:
phase 1 is the churn sound, phase 2 is the butter-coming phase, and phase 3 is the
collection of the butter and the end of the whole process. The Fast Fourier Transform
(FFT) is used for feature extraction. Fig. 3 shows that each phase's sound can be
characterized by its own feature vector. This means that the magnitudes of the frequency
components vary from phase to phase. This variation is due to the change in the sounds of
the three phases, shown in Fig. 3.

6 CONCLUSION
This paper proposed a new method to determine the physical ripening time of the butter
churning process based on its acoustic emission. Digital signal processing techniques are
used to extract features from the sounds of the churning phases. The churning sounds are
categorized in this paper into three categories: the first is the churn phase, the second
is the butter-begin phase, and the third is the butter-collection phase. Each sound is
considered a class, and an artificial neural network is used as the classifier. An
average correct classification rate of 98% is obtained. Acoustic emissions are efficient
features that can be used to automate the butter churning process.

ACKNOWLEDGMENT
All thanks to the Agricultural Research Station at Mu'tah University for providing the
facilities to record the sounds of the butter churning process.