Browsing by Author "Cakar T."
Now showing 1 - 8 of 8
Conference Object
Attention-Enhanced Dual-Head LSTM With Rich Feature Engineering for Risk-Adjusted Stock Return Forecasting (Institute of Electrical and Electronics Engineers Inc., 2025) Patel J.; Gunes P.; Ertugrul S.; Sayar A.; Benli H.; Makaroglu D.; Cakar T.
Stock return forecasting is a challenging task due to the complex, nonlinear, and volatile nature of financial markets. In this paper, we propose a comprehensive deep learning framework that integrates a two-layer Long Short-Term Memory (LSTM) network augmented with a learnable attention mechanism, a dual-head output for simultaneous regression of next-day returns and classification of price direction, and an extensive suite of technical and macro-financial features. Our feature set comprises lagged log-returns, trend indicators (simple and exponential moving averages), momentum oscillators (RSI, MACD), volatility measures (rolling variance and GARCH conditional volatility), price bands (Bollinger Bands, Donchian channels), volume metrics (On-Balance Volume, Volume Rate of Change), Hidden Markov Model regime states, market index returns, and calendar effects. We train and validate the model using a rolling-window cross-validation scheme with early stopping and hyperparameter tuning to ensure temporal robustness. Empirical results on a large multi-stock dataset demonstrate that our attention-enhanced, dual-task LSTM outperforms single-task LSTMs and traditional machine learning benchmarks, achieving lower forecasting error and more stable generalization. © 2025 IEEE.

Conference Object
Churn Prediction for Subscription-Based Applications Using Machine Learning (Institute of Electrical and Electronics Engineers Inc., 2025) Gozukara H.; Patel J.; Kara E.; Yildiz A.; Mese Y.K.; Obali E.; Cakar T.
In this study, a predictive model was developed using machine learning techniques to forecast customer churn in subscription-based video streaming services.
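The learnable attention mechanism named in the return-forecasting abstract above can be illustrated with a minimal numpy sketch: score each LSTM hidden state, softmax the scores into weights, and pool the states into one context vector that feeds the two heads. The shapes and the random weights here are illustrative stand-ins, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Score each timestep's hidden state, softmax the scores into
    weights, and return the attention-weighted sum (context vector)."""
    scores = hidden_states @ w          # (T,)
    weights = softmax(scores)           # (T,), sums to 1
    context = weights @ hidden_states   # (H,)
    return context, weights

rng = np.random.default_rng(0)
T, H = 30, 16                   # 30 timesteps, 16 hidden units (assumed sizes)
h = rng.normal(size=(T, H))     # stand-in for LSTM hidden states
w = rng.normal(size=H)          # learnable scoring vector (here: random)

context, weights = attention_pool(h, w)
# dual heads: one next-day-return regression output, one direction logit
w_reg, w_cls = rng.normal(size=H), rng.normal(size=H)
next_day_return = context @ w_reg
direction_logit = context @ w_cls
```

In a trained network, `w`, `w_reg`, and `w_cls` would be learned jointly with the LSTM rather than drawn at random.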
The data, such as user login records, content viewing information, subscription details, and content-related features, were used to interpret usage patterns, and customer churn was defined based on subscription renewal status and renewal timing. Several usage-based features were extracted for each user, and several algorithms were used for modeling: Random Forest, CatBoost, XGBoost, Logistic Regression, K-Nearest Neighbors, and Gradient Boosting. Class imbalance in the target variable was handled via Borderline-SMOTE. The models' performance was evaluated using training-test accuracy plots, classification reports, and hyperparameter tuning. Even though most of the models performed similarly, the CatBoost model emerged as the top performer, achieving a macro F1-score of 0.60. However, while effective in identifying churners, the models struggled to precisely classify non-churning customers, a common challenge in imbalanced datasets even after applying oversampling techniques. The analysis of feature importance yielded a crucial insight: early and consistent user engagement is the strongest predictor of customer retention. These findings provide valuable, actionable insights for streaming platforms, emphasizing that retention strategies should focus on maximizing engagement immediately after a user subscribes. © 2025 IEEE.

Conference Object
Developing Autonomous Steering Algorithm To Improve Cornering Slip Performance of a Four-Wheel Car Using Neural Network Tools (Institute of Electrical and Electronics Engineers Inc., 2025) Alatciyan D.R.; Emeryan B.J.; Barbaros B.; Cakar T.; Kilic N.
This study investigates neural network-based predictive steering control using simulation data generated from ADAMS Car. A Long Short-Term Memory (LSTM) architecture is employed to estimate steering angle and longitudinal velocity from sequential input features, with the goal of analyzing the model's behavior in cornering scenarios.
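Feeding "sequential input features" to an LSTM, as in the steering study above, means slicing the logged signals into fixed-length windows paired with a later target value. A minimal sketch, where the signal, window size, and horizon are all assumed for illustration:

```python
import numpy as np

def make_windows(series, window, horizon=1):
    """Turn a 1-D signal into (X, y) pairs: each row of X holds `window`
    consecutive samples and y holds the value `horizon` steps later."""
    X, y = [], []
    for start in range(len(series) - window - horizon + 1):
        X.append(series[start:start + window])
        y.append(series[start + window + horizon - 1])
    return np.array(X), np.array(y)

signal = np.sin(np.linspace(0, 6, 50))   # stand-in for a logged steering signal
X, y = make_windows(signal, window=10)
print(X.shape, y.shape)   # (40, 10) (40,)
```

Varying `window` here is the knob the paper sweeps when it compares different sliding-window sizes.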
The experimental setup includes multiple simulation runs under varying configurations, particularly exploring the effect of different sliding-window sizes on prediction performance. Results show that the proposed model can effectively capture temporal patterns in the input data and produce consistent estimations across test conditions. While the study is limited to a simulation environment, it provides initial insights into how AI-based models may support steering control tasks and lays the groundwork for future extensions involving additional vehicle dynamics inputs. © 2025 IEEE.

Conference Object
Graph Theory-Based Fraud Detection in Banking Check Transactions (Institute of Electrical and Electronics Engineers Inc., 2025) Behsi Z.; Memis E.C.; Ertugrul S.; Sayar A.; Gunes P.; Seydioglu S.; Cakar T.
Traditional banking fraud detection systems rely on rule-based approaches that analyze individual transactions in isolation, failing to capture complex relationship patterns indicative of coordinated fraud schemes such as check-kiting and artificial credit score manipulation. We present a novel similarity-based graph-theoretic approach that constructs weighted networks between check issuers using the Jaccard Similarity Index and employs advanced graph analysis to identify suspicious entity clusters without requiring complete transaction relationship data. Our approach combines the Jaccard Similarity Index for behavioral pattern analysis (addressing payee information unavailability) with comprehensive graph analysis, including centrality measures, community detection, and anomaly identification. Through comprehensive evaluation on real banking data containing 458,399 transactions from 121,647 unique issuers (the largest confirmed dataset in the fraud detection literature), we demonstrate the effectiveness of our methodology.
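The issuer network described in the fraud-detection abstract above pairs issuers, computes Jaccard similarity over their behavioral feature sets, and keeps edges above a threshold. A toy sketch, where the issuer names, feature sets, and the 0.5 threshold are all hypothetical (the paper reports 0.55 after grid search):

```python
from itertools import combinations

def jaccard(a, b):
    """|A ∩ B| / |A ∪ B| over two sets of behavioral attributes."""
    return len(a & b) / len(a | b) if a | b else 0.0

# toy issuers described by sets of behavioral features (banks used,
# amount buckets, weekdays) -- names and features are illustrative only
issuers = {
    "issuer_a": {"bank_1", "amt_high", "mon"},
    "issuer_b": {"bank_1", "amt_high", "tue"},
    "issuer_c": {"bank_9", "amt_low", "fri"},
}
THRESHOLD = 0.5

edges = [
    (u, v, jaccard(issuers[u], issuers[v]))
    for u, v in combinations(issuers, 2)
    if jaccard(issuers[u], issuers[v]) >= THRESHOLD
]
print(edges)   # [('issuer_a', 'issuer_b', 0.5)]
```

Centrality, community detection, and anomaly scoring would then run on the resulting weighted graph.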
Following parameter optimization using a grid search (similarity threshold: 0.55, risk percentile: 0.75), our method achieves competitive detection rates in optimal configurations, with an average F1-score of 0.447 (±0.164) and peak performance reaching an F1-score of 0.557, while providing superior network topology analysis with a clustering coefficient of 0.923. The system operates under significant data privacy constraints, lacking personal identification information (names, account numbers, IDs) and complete payee data. Despite these limitations, our method outperforms traditional approaches by leveraging similarity-based indirect relationships, and we project that performance could reach 85-95% levels with complete data access. © 2025 IEEE.

Conference Object
Multi-Output Vs Single-Output Deep Learning for Plant Disease Detection (Institute of Electrical and Electronics Engineers Inc., 2025) Taha Kara H.B.; Sayar A.; Gunes P.; Guvencli M.; Ertugrul S.; Cakar T.
AI-based image processing plays a crucial role in agriculture by enabling early detection of plant diseases, thereby increasing crop productivity and minimizing economic losses. In this study, we present a comparative analysis between a multi-output deep learning model, which simultaneously classifies plant species and assesses their health status, and two separate single-output models trained for each task individually. The publicly available PlantVillage dataset was used for training and evaluation. Performance metrics such as classification accuracy, F1-score, training time, and confusion matrices were used to assess each model. Our results indicate that the multi-output architecture achieves remarkably high classification performance (plant: 99.98%, health: 99.78%) while reducing training time by nearly 50% compared to the combined cost of training two individual models.
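The multi-output idea compared in the plant-disease abstract above, one shared trunk feeding separate species and health heads, reduces to a few matrix products. The layer sizes and class counts below are assumed for illustration only; the abstract does not specify the actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
F, H, N_PLANT, N_HEALTH = 64, 32, 14, 2   # assumed feature/class counts

# shared trunk (stand-in for a CNN backbone) and two task-specific heads
W_trunk = rng.normal(size=(F, H)) * 0.1
W_plant = rng.normal(size=(H, N_PLANT)) * 0.1
W_health = rng.normal(size=(H, N_HEALTH)) * 0.1

def forward(x):
    """One forward pass: shared ReLU representation, then both heads."""
    h = np.maximum(x @ W_trunk, 0.0)       # shared representation
    return h @ W_plant, h @ W_health       # species logits, health logits

x = rng.normal(size=F)                     # stand-in for extracted image features
plant_logits, health_logits = forward(x)
```

The training-time saving reported in the paper comes from computing the shared trunk once per image instead of once per task.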
This demonstrates that a unified model not only provides computational efficiency but also maintains predictive strength, making it a practical alternative for real-time agricultural decision support systems. The findings suggest that integrated modeling can contribute to the development of scalable, resource-efficient solutions in precision agriculture. © 2025 IEEE.

Conference Object
A Multimodal AI and ML Framework for Fashion Image Segmentation, Recommendation, and Similarity Recognition (Institute of Electrical and Electronics Engineers Inc., 2025) Soyhan M.E.; Ay T.B.; Memis E.C.; Fatih Capal M.; Cakar T.; Gunay S.; Coskun H.
This study presents a scalable multimodal Artificial Intelligence (AI) and Machine Learning (ML) framework designed to enhance decision-making in the fashion industry. The proposed system integrates garment segmentation, multimodal feature extraction, and similarity recommendation into a unified pipeline. Using SegFormer for segmentation, along with the convolutional neural network (CNN)-based feature extraction models ResNet152V2 and Xception, and the transformer-based vision-language model LLaVA, the framework generates visual and semantic embeddings of garments. These representations are processed through similarity detection using OpenAI embedding models and stored in the Pinecone vector database for efficient retrieval. Real-time similarity scoring is enabled through FastAPI endpoints, offering interactive search capabilities. Preliminary results demonstrate the system's strong ability to identify conceptually and visually similar items across a large catalog, providing actionable insights for designers. This framework lays the groundwork for intelligent, interpretable, and production-ready AI systems in the fashion industry.
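The similarity-scoring step in the fashion framework above ultimately reduces to nearest-neighbor search over embedding vectors. A minimal cosine-similarity sketch with random stand-in embeddings (the real system uses OpenAI embedding models and the Pinecone vector database rather than an in-memory matrix):

```python
import numpy as np

def top_k(query, catalog, k=3):
    """Rank catalog embeddings by cosine similarity to the query."""
    q = query / np.linalg.norm(query)
    c = catalog / np.linalg.norm(catalog, axis=1, keepdims=True)
    sims = c @ q
    order = np.argsort(-sims)[:k]
    return order, sims[order]

rng = np.random.default_rng(2)
catalog = rng.normal(size=(100, 8))   # 100 garments, 8-dim embeddings (toy)
query = catalog[42] + 0.01 * rng.normal(size=8)   # near-duplicate of item 42

idx, scores = top_k(query, catalog)
print(idx[0])   # 42
```

A vector database performs the same ranking with approximate-nearest-neighbor indexes so it scales past what a dense matrix product allows.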
© 2025 IEEE.

Conference Object
Predicting Customer Churn in Retail Using Machine Learning on Transaction Data (Institute of Electrical and Electronics Engineers Inc., 2025) Bozan M.T.; Gozukara H.; Patel J.; Kizilay A.; Sahin Z.; Tosun B.; Cakar T.
Customer churn prediction is critical for businesses to retain customers and reduce revenue loss. This paper presents a retail customer churn prediction study. We preprocess transactional data from a retail dataset comprising approximately 19.7 million transactions involving over 1 million customers. Temporal behavioral features, such as purchase frequency, monetary value, product variety, and promotional engagement metrics, are engineered using a four-month observation window. A Random Forest classifier is trained, using balanced class weighting to address churn class imbalance. The churn label is defined as customers not purchasing in the subsequent six-month period. Our Random Forest model achieves approximately 84% accuracy, 86% precision, 85% recall, and an F1-score of 85%. Additionally, an XGBoost model achieves similar accuracy (≈84%) but higher recall (93%) and a higher F1-score (89%), indicating improved churn detection. Confusion matrices illustrate each model's performance. This study demonstrates that carefully engineered RFM-based features and ensemble learning approaches significantly enhance churn prediction in retail contexts. © 2025 IEEE.

Conference Object
A Predictive Model for Bounced Check Risk Using Machine Learning (Institute of Electrical and Electronics Engineers Inc., 2025) Kaya K.; Sayar A.; Memis E.C.; Ozlem S.; Ertugrul S.; Cakar T.
Bounced checks result in direct monetary losses. Traditional rule-based systems cannot adapt to new patterns and lack flexibility. In this study, we used a large and imbalanced check dataset with customer profiles, credit limits, and historical check outcomes.
We applied feature engineering emphasizing time-based transaction patterns, extensive clustering, anomaly detection, and inflation adjustment. We trained six models on each of two datasets, both undersampled to handle class imbalance: Logistic Regression, Random Forest, XGBoost, LightGBM, Extra Trees, and CatBoost. The best-performing model, CatBoost, achieved macro F1-scores of 88.5 percent on the individual-checks dataset, with a gross sunk rate of 4.92 percent, and 91.7 percent on the corporate-checks dataset, with a gross sunk rate of 4.28 percent. These results show that the model can identify the checks most likely to bounce before they are granted while maintaining a low gross sunk rate overall. This study presents a data-driven machine learning solution that enables financial companies to predict and prevent bounced checks before they occur. © 2025 IEEE.
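The undersampling mentioned in the bounced-check abstract above can be as simple as discarding majority-class rows until the classes match. A toy sketch, with hypothetical field names and a fixed seed for reproducibility:

```python
import random

def undersample(rows, label_key="bounced", seed=0):
    """Randomly drop majority-class rows until classes are balanced."""
    pos = [r for r in rows if r[label_key]]
    neg = [r for r in rows if not r[label_key]]
    major, minor = (pos, neg) if len(pos) > len(neg) else (neg, pos)
    random.Random(seed).shuffle(major)
    return minor + major[:len(minor)]

# toy data: 2 bounced checks among 10 (field names are illustrative)
rows = [{"amount": i, "bounced": i < 2} for i in range(10)]
balanced = undersample(rows)
print(len(balanced))   # 4
```

In practice the discarded rows carry information, which is one reason the paper compares six models on the undersampled data and tracks the gross sunk rate alongside macro F1.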

