Total publications in this section: 17
Article
Babkin E., Satunin S. V. Expert Systems with Applications. 2014. Vol. 41. No. 15. P. 6622-6633.

Challenges of urbanization require new, more flexible approaches to the design of public transportation systems. Demand Responsive Transport (DRT) systems that provide shared transportation services with flexible routes and focus on optimizing economic and environmental value are becoming an important part of public transportation. In this paper we propose a new approach to the design of DRT models that considers DRT as a multi-agent system (MAS) in which various autonomous agents represent the interests of the system's stakeholders. The distributed nature of the MAS facilitates the design of scalable implementations in modern cloud environments. We also propose a planning algorithm based on combinatorial auctions (CA) that allows the value of multiple transportation scenarios to be expressed directly through bids. Using the CA mechanism, we can fully take into account the complementarity and substitutability among items, which differ across bidders. Further, we describe the design principles of our proposed software together with a prototype implementation. We believe that our approach to multi-agent modeling is general enough to provide the flexibility necessary for adopting DRT-service modeling in real-world scenarios. The results of modeling have been compared against several cases of a local bus provider and validated in a set of computational experiments.
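
As an illustrative aside (not the authors' algorithm), the sketch below shows brute-force winner determination for a tiny combinatorial auction in which each bid names a bundle of ride requests and a price; the bidders, bundles, and prices are invented.

```python
# Illustrative sketch: brute-force winner determination for a small
# combinatorial auction where each bid covers a bundle of ride requests.
from itertools import combinations

bids = [  # (bidder, bundle of ride-request ids, offered price) -- hypothetical data
    ("vehicle_A", {1, 2}, 9.0),
    ("vehicle_B", {2, 3}, 7.5),
    ("vehicle_C", {3}, 4.0),
    ("vehicle_D", {1}, 5.0),
]

def winner_determination(bids):
    """Return the revenue-maximizing set of non-overlapping bids."""
    best_value, best_set = 0.0, []
    for r in range(1, len(bids) + 1):
        for subset in combinations(bids, r):
            bundles = [b[1] for b in subset]
            # accepted bids may not share items: each request is served once
            if sum(len(b) for b in bundles) == len(set().union(*bundles)):
                value = sum(b[2] for b in subset)
                if value > best_value:
                    best_value, best_set = value, list(subset)
    return best_value, best_set

print(winner_determination(bids))
```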

Added: Jan 19, 2015
Article
Fedorova E., Gilenko E., Dovzhenko S. Expert Systems with Applications. 2013. Vol. 40. P. 7285-7293.

Bankruptcy forecasting

Added: Nov 7, 2014
Article
Lopez Iturriaga F. J., Sanz I. P. Expert Systems with Applications. 2015. Vol. 42. No. 6. P. 2857-2869.

We develop a model of neural networks to study the bankruptcy of U.S. banks, taking into account the specific features of the recent financial crisis. We combine multilayer perceptrons and self-organizing maps to provide a tool that displays the probability of distress up to three years before bankruptcy occurs. Based on data from the Federal Deposit Insurance Corporation between 2002 and 2012, our results show that failed banks are more concentrated in real estate loans and have more provisions. Their situation is partially due to risky expansion, which results in less equity and interest income. After drawing the profile of distressed banks, we develop a model to detect failures and a tool to assess bank risk in the short, medium and long term using bankruptcies that occurred from May 2012 to December 2013 in U.S. banks. The model can detect 96.15% of the failures in this period and outperforms traditional models of bankruptcy prediction.
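
For illustration only, here is a minimal sketch of the multilayer-perceptron component of such a distress model on synthetic data; the feature semantics and data are invented, and the self-organizing-map stage is omitted.

```python
# Illustrative sketch of an MLP "probability of distress" classifier
# on synthetic bank-ratio data (not the authors' model or data).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))   # e.g. loan concentration, provisions, equity, ... (assumed)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0).fit(X_tr, y_tr)
print("failure probability of first test bank:", clf.predict_proba(X_te[:1])[0, 1])
```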

Added: Dec 10, 2015
Article
Gromov V., Shulga A. N. Expert Systems with Applications. 2012. Vol. 39. No. 9. P. 8474-8478.

In this study, a novel method to predict chaotic time series is proposed. The method employs the ant colony optimization paradigm to analyze the topological structure of the attractor behind the given time series and to single out typical sequences corresponding to different parts of the attractor. The typical sequences are used to predict the time series values. The method was applied to time series generated by the Lorenz system and the Mackey–Glass equation, as well as to weather time series. The method is able to provide a robust prognosis for periods comparable with the prediction horizon.

Highlights

► A novel method based on ant colony optimization to predict chaotic series is proposed.
► The method allows forecasting series for periods comparable with the prognosis horizon.
► The method was tested on standard benchmarks (the Lorenz system, the Mackey–Glass equation).
► The method allows extracting the parts of a series that are good or bad for prognosis.
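
As a loose illustration of the idea in the abstract above, that recurring patterns on the attractor can drive prediction, the sketch below forecasts a chaotic series with a simple nearest-neighbour search over delay vectors; it is not the paper's ant colony method, and the logistic-map data and embedding dimension are assumptions.

```python
# Nearest-neighbour forecast on a delay embedding of a chaotic series (toy data).
import numpy as np

def logistic_series(n, x0=0.2, r=3.9):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1 - x[i - 1])
    return x

series = logistic_series(2000)
train, test = series[:1500], series[1500:]
dim = 3  # embedding dimension (assumed)

# delay vectors from the training part and the value that follows each of them
vectors = np.array([train[i:i + dim] for i in range(len(train) - dim)])
targets = train[dim:]

def predict_next(history):
    """Predict the next value from the last `dim` observed values."""
    query = np.asarray(history[-dim:])
    nearest = np.argmin(np.linalg.norm(vectors - query, axis=1))
    return targets[nearest]

print("true:", test[dim], "predicted:", predict_next(test[:dim]))
```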

Added: Sep 27, 2018
Article
Iscan Z., Dokur Z., Demiralp T. Expert Systems with Applications. 2011. Vol. 38. No. 8. P. 10499-10505.
Added: Jan 21, 2015
Article
Bukharov O. E., Bogolyubov D. P. Expert Systems with Applications. 2015. Vol. 42. No. 15-16. P. 6177-6183.

Given the ever-increasing volume of information and the complexity of engineering, social and economic systems, it has become more difficult to assess incoming data and manage such systems properly. Currently developed innovative decision support systems (DSS) aim to achieve optimum results while minimizing the risks of serious losses. The purpose of a DSS is to help the decision-maker facing huge amounts of data and the ambiguous reactions of complicated systems to external factors. By means of accurate and profound analysis, DSSs are expected to provide the user with precisely forecasted indicators and optimal decisions.

In this paper we suggest a new DSS structure that can be used in a wide range of difficult-to-formalize tasks and achieves a high speed of calculation and decision-making.

We examine different approaches to determining the dependence of a target variable on input data and review the most common statistical forecasting methods. The advantages of using neural networks for this purpose are described. We suggest applying interval neural networks for calculations with underdetermined (interval) data, which makes it possible to use our DSS in a wide range of complicated tasks. We developed a corresponding learning algorithm for the interval neural networks. The advantages of using a genetic algorithm (GA) to select the most significant inputs are shown. We justify the use of general-purpose computing on graphics processing units (GPGPU) to achieve high-speed calculations with the decision support system in question. A functional diagram of the system is presented and described. The results and samples of the DSS application are demonstrated.
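
A minimal sketch of one ingredient mentioned in the abstract, GA-based selection of the most significant inputs, on invented data; the interval neural network and GPGPU parts are not shown, the crossover step is omitted, and the population size and mutation rate are arbitrary.

```python
# Tiny genetic algorithm selecting input features for a regressor (toy data).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))
y = 2 * X[:, 0] - X[:, 3] + rng.normal(scale=0.1, size=300)  # only features 0 and 3 matter

def fitness(mask):
    """Cross-validated score of a model restricted to the selected features."""
    if not mask.any():
        return -np.inf
    return cross_val_score(LinearRegression(), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(20, X.shape[1])).astype(bool)  # random feature masks
for _ in range(30):                                           # generations
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]                   # keep the best half
    children = parents[rng.integers(0, 10, 10)].copy()        # clone random parents
    flips = rng.random(children.shape) < 0.1                  # bit-flip mutation
    children[flips] = ~children[flips]
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected features:", np.flatnonzero(best))
```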

Added: May 17, 2015
Article
Zelenkov Y. Expert Systems with Applications. 2019. Vol. 135. P. 71-82.

Intelligent computer systems aim to help humans in making decisions. Many practical decision-making problems are classification problems in their nature, but standard classification algorithms are often not applicable since they assume a balanced distribution of classes and constant misclassification costs. From this point of view, algorithms that consider the cost of decisions are essential since they are more consistent with the requirements of real life. These algorithms generate decisions that directly optimize parameters valuable for business, for example, cost savings. Yet, despite the practical value of cost-sensitive algorithms, only a small number of works study this problem, concentrating mainly on the case when the cost of a classifier error is constant and does not depend on a specific example. However, many real-world classification tasks are example-dependent cost-sensitive (ECS), where the costs of misclassification vary between examples and not only within classes. Existing methods of ECS learning include only modifications of the simplest models of machine learning (naive Bayes, logistic regression, decision tree). These models produce promising results, but there is a need for further improvement in performance that can be achieved by using gradient-based ensemble methods. To bridge this gap, we present the ECS generalization of AdaBoost. We study three models which differ in the way cost is introduced into the loss function: inside the exponent, outside the exponent, and both inside and outside the exponent. The results of the experiments on three synthetic and two real datasets (bank marketing and insurance fraud) show that example-dependent cost-sensitive modifications of AdaBoost outperform other known models. Empirical results also show that the critical factors influencing the choice of the model are not only the distribution of features, which is typical for cost-insensitive and class-dependent cost-sensitive problems, but also the distribution of costs. Next, since the outputs of AdaBoost are not well-calibrated posterior probabilities, we check three approaches to the calibration of classifier scores: Platt scaling, isotonic regression, and ROC modification. The results show that calibration not only significantly improves the performance of specific ECS models but also unlocks the capabilities of the original AdaBoost. The obtained results provide new insight into the behavior of cost-sensitive models from a theoretical point of view and prove that the presented approach can significantly improve the practical design of intelligent systems.
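
A minimal sketch, assuming the simplest way to make AdaBoost sensitive to example-dependent costs (cost-proportional initial weights with the standard exponential update); the paper's models, which place the cost inside and/or outside the exponent of the loss itself, are richer than this. Data and costs are synthetic.

```python
# Cost-weighted AdaBoost with decision stumps on synthetic data (illustrative only).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
cost = rng.uniform(1.0, 5.0, size=400)      # per-example misclassification cost (assumed)

w = cost / cost.sum()                        # cost-proportional initial weights
stumps, alphas = [], []
for _ in range(50):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = w[pred != y].sum()
    if err >= 0.5 or err == 0:
        break
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(-alpha * y * pred)           # standard exponential reweighting
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

ensemble = lambda X: np.sign(sum(a * s.predict(X) for a, s in zip(alphas, stumps)))
print("training accuracy:", (ensemble(X) == y).mean())
```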

Added: Jun 14, 2019
Article
Poelmans J., Ignatov D. I., Kuznetsov S. et al. Expert Systems with Applications. 2013. Vol. 40. No. 16. P. 6538-6560.

This is the second part of a large survey paper in which we analyze recent literature on Formal Concept Analysis (FCA) and some closely related disciplines using FCA. We collected 1072 papers published between 2003 and 2011 mentioning terms related to Formal Concept Analysis in the title, abstract and keywords. We developed a knowledge browsing environment to support our literature analysis process. We use the visualization capabilities of FCA to explore the literature, to discover and conceptually represent the main research topics in the FCA community. In this second part, we zoom in on and give an extensive overview of the papers published between 2003 and 2011 which applied FCA-based methods for knowledge discovery and ontology engineering in various application domains. These domains include software mining, web analytics, medicine, biology and chemistry data.

Added: Oct 3, 2013
Article
Poelmans J., Kuznetsov S., Ignatov D. I. et al. Expert Systems with Applications. 2013. Vol. 40. No. 16. P. 6601-6623.

This is the first part of a large survey paper in which we analyze recent literature on Formal Concept Analysis (FCA) and some closely related disciplines using FCA. We collected 1072 papers published between 2003 and 2011 mentioning terms related to Formal Concept Analysis in the title, abstract and keywords. We developed a knowledge browsing environment to support our literature analysis process. We use the visualization capabilities of FCA to explore the literature, to discover and conceptually represent the main research topics in the FCA community. In this first part, we zoom in on and give an extensive overview of the papers published between 2003 and 2011 on developing FCA-based methods for knowledge processing. We also give an overview of the literature on FCA extensions such as pattern structures, logical concept analysis, relational concept analysis, power context families, fuzzy FCA, rough FCA, temporal and triadic concept analysis and discuss scalability issues.
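
To make the basic FCA machinery concrete, the sketch below applies the two derivation operators to a toy object-attribute context (invented data) and recovers a formal concept; the extensions surveyed in the paper go far beyond this.

```python
# Derivation operators on a toy formal context (illustrative only).
context = {                      # object -> set of attributes (hypothetical data)
    "paper1": {"FCA", "ontology"},
    "paper2": {"FCA", "software mining"},
    "paper3": {"FCA", "ontology", "web analytics"},
}

def common_attributes(objs):
    """A' : attributes shared by all objects in objs (all attributes for the empty set)."""
    sets = [context[o] for o in objs] or [set.union(*context.values())]
    return set.intersection(*sets)

def common_objects(attrs):
    """B' : objects having all attributes in attrs."""
    return {o for o, a in context.items() if attrs <= a}

objs = {"paper1", "paper3"}
intent = common_attributes(objs)
extent = common_objects(intent)
print(extent, intent)            # (extent, intent) is a formal concept of the context
```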

 

Added: Oct 3, 2013
Article
Li J., Pardalos P. M., Sun H. et al. Expert Systems with Applications. 2015. Vol. 42. No. 7. P. 3551-3561.

Although the multi-depot vehicle routing problem with simultaneous deliveries and pickups (MDVRPSDP) is often encountered in real-life scenarios of transportation logistics, it has received little attention so far. In particular, no papers have ever used metaheuristics to solve it. In this paper, a metaheuristic based on iterated local search is developed for the MDVRPSDP. To strengthen the search, an adaptive neighborhood selection mechanism is embedded into both the improvement and the perturbation steps of iterated local search. To diversify the search, new perturbation operators are proposed. Computational results indicate that the proposed approach outperforms the previous methods for MDVRPSDP. Moreover, when applied to VRPSDP benchmarks, the results are better than those obtained by large neighborhood search, particle swarm optimization, and ant colony optimization approaches.
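
A schematic sketch of the iterated local search template on a toy single-vehicle routing instance with random coordinates; the multi-depot, pickup-and-delivery constraints and the adaptive neighborhood selection of the paper are omitted.

```python
# Iterated local search skeleton: 2-opt descent plus a random perturbation (toy instance).
import random, math

random.seed(0)
points = [(random.random(), random.random()) for _ in range(12)]
dist = lambda a, b: math.dist(points[a], points[b])
length = lambda tour: sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def local_search(tour):            # 2-opt descent
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if length(cand) < length(tour):
                    tour, improved = cand, True
    return tour

def perturb(tour):                 # move a random segment to the end
    i, j = sorted(random.sample(range(1, len(tour)), 2))
    return tour[:i] + tour[j:] + tour[i:j]

best = local_search(list(range(len(points))))
for _ in range(50):                # iterated local search loop
    cand = local_search(perturb(best))
    if length(cand) < length(best):
        best = cand
print("tour length:", round(length(best), 3))
```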

Added: Jan 23, 2015
Article
Antipov E. A., Pokryshevskaya E. B. Expert Systems with Applications. 2012. Vol. 39. No. 2. P. 1772-1778.

To the best of the authors' knowledge, this is the first attempt to use Random Forest as a technique for residential real estate mass appraisal. In an empirical study using data on residential apartments, the method performed better than such techniques as CHAID, CART, KNN, multiple regression analysis, Artificial Neural Networks (MLP and RBF) and Boosted Trees. An approach is introduced for automatically detecting segments where a model significantly underperforms and segments with systematically under- or overestimated predictions. This segmentational approach is applicable to various expert systems including, but not limited to, those used for mass appraisal.
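
For illustration only, a random-forest appraisal model on synthetic apartment data, followed by a crude per-segment bias check in the spirit of the segment-detection idea; all features and segments are invented.

```python
# Random forest mass-appraisal sketch with a per-segment bias check (synthetic data).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "area": rng.uniform(30, 120, 1000),
    "floor": rng.integers(1, 25, 1000),
    "district": rng.integers(0, 5, 1000),
})
df["price"] = 1500 * df["area"] + 2000 * df["district"] + rng.normal(0, 5000, 1000)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(df[["area", "floor", "district"]], df["price"])
df["error"] = df["price"] - model.predict(df[["area", "floor", "district"]])

# segments (here: districts) where predictions are systematically biased
print(df.groupby("district")["error"].mean())
```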

Added: Oct 4, 2012
Article
Ignatov D. I., Nikolenko S. I., Abaev T. et al. Expert Systems with Applications. 2016. Vol. 55. P. 546-558.

We present a new recommender system developed for the Russian interactive radio network FMhost. To the best of our knowledge, it is the first model and associated case study for recommending radio stations hosted by real DJs rather than automatically built streamed playlists. To address such problems as cold start, gray sheep, boosting of rankings, preference and repertoire dynamics, and absence of explicit feedback, the underlying model combines a collaborative user-based approach with personalized information from tags of listened tracks in order to match user and radio station profiles. This is made possible with adaptive tag-aware profiling that follows an online learning strategy based on user history. We compare the proposed algorithms with singular value decomposition (SVD) in terms of precision, recall, and normalized discounted cumulative gain (NDCG) measures; experiments show that in our case the fusion-based approach demonstrates the best results. In addition, we give a theoretical analysis of some useful properties of fusion-based linear combination methods in terms of graded ordered sets.
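
As a small aside, the sketch below computes the NDCG measure used to compare the recommenders; the relevance scores are made-up example data, not results from the paper.

```python
# Normalized discounted cumulative gain at rank k for one ranked list (toy scores).
import numpy as np

def ndcg_at_k(relevances, k):
    """NDCG@k: DCG of the given ranking divided by the DCG of the ideal ranking."""
    rel = np.asarray(relevances, dtype=float)[:k]
    dcg = float((rel / np.log2(np.arange(2, rel.size + 2))).sum())
    ideal = np.sort(np.asarray(relevances, dtype=float))[::-1][:k]
    idcg = float((ideal / np.log2(np.arange(2, ideal.size + 2))).sum())
    return dcg / idcg if idcg > 0 else 0.0

print(ndcg_at_k([3, 2, 3, 0, 1, 2], k=5))   # ranking quality of one user's list
```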

Added: Jun 28, 2016
Article
Korürek M., Yüksel A., Iscan Z. et al. Expert Systems with Applications. 2010. Vol. 37. No. 3. P. 1946-1954.

X-ray bone images are used in areas such as bone age assessment, bone mass assessment and examination of bone fractures. Medical image analysis is a very challenging problem due to the large variability in topologies, the complexity of medical structures and poor image modalities such as noise, low contrast, several kinds of artifacts and restrictive scanning methods. Computer-aided analysis leads to operator-independent, objective and fast results.

In this study, the near-field effect of the X-ray source is eliminated from hand radiographic images. First, the near-field effect of the X-ray source is modeled; then the parameters of the model are estimated using genetic algorithms. The near-field effect is corrected for all image pixels retrospectively.

Two different categories of images are analyzed to show the performance of the developed algorithm. These are original X-ray hand images and phantom hand images. Phantom hand images are used to analyze the effect of noise. Two performance criteria are proposed to test the developed algorithm: Hand segmentation performance and variance value of the pixels in the background. It is observed that the variance value of the pixels in the background decreases, and hand segmentation performance increases after retrospective correction process is applied.

Added: Jan 22, 2015
Article
Iscan Z., Dokur Z., Ölmez T. Expert Systems with Applications. 2010. Vol. 37. No. 3. P. 2540-2549.
In this study, a novel method is proposed for the detection of tumor in magnetic resonance (MR) brain images. The performance of the novel method is investigated on one phantom and 20 original MR brain images with tumor and 50 normal (healthy) MR brain images. Before the segmentation process, 2D continuous wavelet transform (CWT) is applied to reveal the characteristics of tissues in MR head images. Then, each MR image is segmented into seven classes (six head tissues and the background) by using the incremental supervised neural network (ISNN) and the wavelet-bands. After the segmentation process, the head is extracted from the background by simply discarding the background pixels. The symmetry axis of the head in the MR image is determined by using moment properties. Asymmetry is analyzed by using the Zernike moments of each of the six tissues segmented in the head: two vectors are individually formed for the left and right hand sides of the symmetry axis on the sagittal plane by using the Zernike moments of the segmented tissues in the head. The presence of asymmetry and of tumors is assessed by considering the distance between these two vectors. The performance of the proposed method is further investigated by moving the location of the tumor and by modifying its size in the phantom image. It is observed that tumor detection is successfully realized for the 20 tumorous MR brain images.
Added: Jan 22, 2015
Article
Zelenkov Y., Fedorova E., Chekrizov D. Expert Systems with Applications. 2017. Vol. 88. P. 393-401.

To date, many bankruptcy forecasting models have been developed, but this area remains an active field of research; little is known about the practical application of existing models. In our opinion, this is because the use of existing models is limited by the conditions in which they were developed. Another question concerns the factors that can be significant for forecasting. Many authors suggest that indicators of the external environment, corporate governance as well as firm size contain important information; on the other hand, a large number of factors does not necessarily increase the predictive ability of a model. In this paper, we suggest a genetic algorithm based two-step classification method (TSCM) that allows both selecting the relevant factors and adapting the model itself to the application. Classifiers of various models are trained at the first step and combined into a voting ensemble at the second step. A combination of random sampling and feature selection techniques is used to ensure the necessary diversity of classifiers at the first step. Genetic algorithms are applied at the feature selection step and then at the step of determining the weights in the ensemble. The characteristics of the proposed method have been tested on a balanced dataset that included 912 observations of Russian companies (456 bankrupts and 456 successful) and 55 features (financial ratios and macro/micro business environment factors). The proposed method has shown the best accuracy (0.934) among the tested models. It has also shown the most balanced precision-recall ratio, identifying bankrupts (recall = 0.953) and non-bankrupts (precision = 0.910) more accurately than the other tested models. The ability of the method to select task-relevant features has also been tested. Excluding the features that are significant for less than 50% of the classifiers in the ensemble improved all performance metrics (accuracy = 0.951, precision = 0.932, recall = 0.965). Thus, the proposed method amplifies the advantages and alleviates the weaknesses inherent in ordinary classifiers, enabling business decision support with higher reliability.
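
A minimal sketch of the voting-ensemble idea on synthetic data of the same shape (912 observations, 55 features); here the base classifiers and ensemble weights are fixed by hand, whereas the paper derives the feature subsets and weights with genetic algorithms.

```python
# Weighted soft-voting ensemble of heterogeneous classifiers (synthetic data, illustrative only).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=912, n_features=55, n_informative=10, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logit", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5)),
        ("forest", RandomForestClassifier(n_estimators=100)),
    ],
    voting="soft",
    weights=[1.0, 0.5, 2.0],     # fixed here; chosen by a GA in the paper
)
print("accuracy:", cross_val_score(ensemble, X, y, cv=5).mean().round(3))
```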

Added: Oct 30, 2017
Article
Savchenko A., Belova N. S. Expert Systems with Applications. 2018. Vol. 108. P. 170-182.

The paper deals with the unconstrained face recognition task for the small sample size problem, based on computing distances between high-dimensional off-the-shelf features extracted by a deep convolutional neural network. We present a novel statistical recognition method which maximizes the likelihood (joint probabilistic density) of the distances to all reference images from the gallery set. This likelihood is estimated with the known asymptotically normal distribution of the Kullback–Leibler discrimination between nonnegative features. Our approach penalizes individuals whose feature vectors do not behave like the features of the observed image in the space of dissimilarities of the gallery images. We provide an experimental study with the LFW (Labeled Faces in the Wild), YTF (YouTube Faces) and IJB-A (IARPA Janus Benchmark A) datasets and the state-of-the-art deep learning-based feature extractors (VGG-Face, VGGFace2, ResFace-101, CenterFace and Light CNN). It is demonstrated that the proposed approach can be applied with traditional distances in order to increase accuracy by 0.3–5.5% when compared to known methods, especially if the training and testing images are significantly different.
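
A minimal sketch, assuming random nonnegative vectors in place of CNN features: nearest-gallery matching with the symmetric Kullback–Leibler discrimination between L1-normalized features; the probabilistic likelihood model of the paper is not reproduced here.

```python
# Nearest-gallery matching with symmetric KL discrimination (synthetic features, illustrative only).
import numpy as np

rng = np.random.default_rng(0)
gallery = rng.random((10, 128)) + 1e-6          # 10 reference "identities" (made up)
probe = gallery[3] + 0.05 * rng.random(128)     # a noisy observation of identity 3

def kl(p, q):
    """Kullback-Leibler discrimination between L1-normalized nonnegative vectors."""
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def symmetric_kl(p, q):
    return kl(p, q) + kl(q, p)

distances = [symmetric_kl(probe, g) for g in gallery]
print("predicted identity:", int(np.argmin(distances)))
```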

Added: May 17, 2018