Scientific Research

Al-Maarif University College attains an advanced ranking for publishing research in the Clarivate and Scopus indexing databases

Abstract: This paper illustrates the use of several machine learning approaches based on support vector machines for classifying a Sickle Cell Disease data set. It demonstrates that support vector machines yield a substantial improvement when applied to the pre-processing of clinical time-series data. The objective of this study is to present findings for several classes of approaches to therapeutically related problems, with the aim of achieving high accuracy and performance. The primary task in this study is classifying the dosage required for each patient individually. We applied a number of support vector machine models to the sickle cell data set and compared them using standard performance evaluation metrics. The results indicate that the standard Support Vector Classifier performed worse than the Radial Basis Function Support Vector Classifier. For our sickle cell data set, the Parzen Kernel Support Vector Classifier produced the highest performance and accuracy, with a training accuracy of 0.89733 and AUC of 0.94267, and a testing accuracy of 0.81778 with an area under the curve of 0.86556.
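As a rough illustration of the kind of kernel comparison reported above, the sketch below fits scikit-learn support vector classifiers with linear and RBF kernels on placeholder data and reports accuracy and AUC. The feature matrix, labels, and split are all assumptions; the actual sickle cell data set and the Parzen-window kernel used in the study are not reproduced here.

```python
# Hypothetical sketch: comparing SVM kernels on a tabular clinical data set,
# assuming features X and dosage-class labels y are available as NumPy arrays.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(450, 8))                       # placeholder for the sickle cell features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)       # placeholder dosage class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

for name, kernel in [("Linear SVC", "linear"), ("RBF SVC", "rbf")]:
    model = make_pipeline(StandardScaler(), SVC(kernel=kernel, probability=True))
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]
    print(name,
          "accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 5),
          "AUC:", round(roc_auc_score(y_te, proba), 5))
```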

Indexed in Scopus

Abstract: As temperatures rise globally, parts of the water cycle will likely speed up due to climate change as evapotranspiration rates increase throughout the world. In this study, three models have been applied to predict the daily evapotranspiration (ETo) over Santaella station, which is located in Spain. The models are Hargreaves-Samani (HS), modified Hargreaves-Samani (MHS), and Group Method of Data Handling neural network (GMDH-NN). These models are developed using very limited data (temperature only). The study found that the HS approach provides the poorest prediction, while the GMDH-NN performance was superior to the MHS. Furthermore, the GMDH-NN model showed a prediction improvement of 16.45% in terms of uncertainty at 95% compared to the MHS model. The study also showed that it is possible to efficiently predict the ETo using a very limited number of meteorological parameters.
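For context, the standard Hargreaves-Samani equation needs only temperature and extraterrestrial radiation, which is why temperature-only models such as those above are attractive. The sketch below implements the common form of the baseline HS equation only; the modified HS and GMDH-NN models from the study are not reproduced.

```python
# Standard Hargreaves-Samani (1985) equation for daily ETo (mm/day).
# Ra is extraterrestrial radiation in MJ m^-2 day^-1; temperatures are in degrees C.
def hargreaves_samani_eto(t_min: float, t_max: float, ra: float) -> float:
    t_mean = (t_min + t_max) / 2.0
    # 0.408 converts radiation (MJ m^-2 day^-1) into an equivalent water depth (mm/day)
    return 0.0023 * 0.408 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5

# Example: a warm summer day with Ra of about 40 MJ m^-2 day^-1
print(round(hargreaves_samani_eto(t_min=18.0, t_max=34.0, ra=40.0), 2), "mm/day")
```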

Indexed in Scopus

Abstract: Despite modern advances used to estimate the discharge coefficient (Cd), accurately determining Cd for side weirs remains a major challenge for hydraulic engineers. In this study, extra tree regression (ETR) was used to predict the Cd of rectangular sharp-crested side weirs from hydraulic and geometrical parameters. The predictive capacity of the ETR model was validated against two other predictive models, namely extreme learning machine (ELM) and random forest (RF). The quantitative assessment revealed that the ETR model achieved the highest prediction accuracy among the applied models and exhibited excellent agreement between measured and predicted Cd (correlation coefficient of 0.9603). Moreover, the ETR achieved 6.73% and 22.96% higher prediction accuracy in terms of root mean square error in comparison to ELM and RF, respectively. Furthermore, the sensitivity analysis shows that the geometrical parameter b/B has the most influence on Cd. Overall, the proposed ETR model is found to be a suitable, practical, and qualified computer-aided technique for modeling Cd and may contribute to enhancing basic knowledge of hydraulic considerations.
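A minimal sketch of an extra trees regressor for a discharge coefficient is shown below, assuming a table of dimensionless predictors such as b/B is available. The synthetic data, the chosen predictors, and the hyperparameters are all assumptions and do not reproduce the study's data or tuned model.

```python
# Hypothetical sketch: extra trees regression for a discharge coefficient (Cd).
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 4))                            # placeholder predictors, e.g. b/B, Fr, p/y1, L/b
cd = 0.6 - 0.2 * X[:, 0] + 0.05 * rng.normal(size=300)    # synthetic Cd target

X_tr, X_te, y_tr, y_te = train_test_split(X, cd, test_size=0.25, random_state=1)
etr = ExtraTreesRegressor(n_estimators=300, random_state=1).fit(X_tr, y_tr)

pred = etr.predict(X_te)
print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 4))
print("Feature importances (proxy for sensitivity):", np.round(etr.feature_importances_, 3))
```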

Indexed in Scopus

Abstract: Sentiment classification is a very popular topic for identifying user opinions and has been extensively applied in Natural Language Processing (NLP) tasks. The Gated Recurrent Unit (GRU) has been successfully applied to NLP with outstanding results. GRU networks perform better on sequential learning tasks and overcome the vanishing and exploding gradient problems of standard recurrent neural networks (RNNs). In this paper, we improve the efficiency of the GRU framework by applying batch normalization and replacing the traditional tanh activation function with Leaky ReLU (LReLU). Empirically, we show that our model, with slight hyperparameter tuning and tuning of the statistic vectors, obtains excellent results on benchmark datasets for sentiment classification. The proposed BN-GRU model performs well compared to various existing approaches in terms of accuracy and loss. The experimental results show that the proposed model achieves better performance than several state-of-the-art approaches on two benchmark datasets: 82.4% accuracy on the IMDB dataset, and 88.1% binary classification accuracy and 49.9% fine-grained accuracy on the SSTb dataset.
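A hedged Keras sketch of a BN-GRU-style classifier is given below: a GRU whose activation is swapped from tanh to Leaky ReLU, followed by batch normalization and a sigmoid output for binary sentiment. The vocabulary size, sequence length, layer widths, and exact layer ordering are assumptions rather than the paper's configuration.

```python
# Sketch of a BN-GRU-style sentiment classifier (assumed architecture, not the paper's).
import tensorflow as tf

VOCAB_SIZE, MAX_LEN = 20000, 200      # assumed preprocessing settings

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_LEN,)),
    tf.keras.layers.Embedding(VOCAB_SIZE, 128),
    # GRU with Leaky ReLU in place of the usual tanh activation;
    # note this disables the fused cuDNN implementation.
    tf.keras.layers.GRU(64, activation=tf.nn.leaky_relu),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # binary sentiment (e.g. IMDB)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```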

Indexed in Scopus

Abstract: This paper introduces an improved Advanced Encryption Standard (AES) cipher by proposing a new algorithm based on a magic square to decrease the AES execution time. This is done by replacing the MixColumns function with a magic square of order 6. The paper also raises the security level of the AES cryptosystem by using an additional key, generated from the magic square, while decreasing the execution time. For the application of encrypting a colour image, Visual Studio and MATLAB were used to compute the results. The complexity, execution time, National Institute of Standards and Technology (NIST) tests, histograms, differential attacks, and peak signal-to-noise ratio (PSNR) are computed and compared between the original AES cryptosystem and the proposed algorithm. The proposed algorithm yields reasonable findings under several evaluation metrics; for instance, its complexity is higher than that of basic AES while its execution time is lower. The experimental results show that the suggested algorithm provides an efficient and secure way to encrypt images. It also increases the complexity of the cipher process and makes linear and differential cryptanalysis harder by pre-processing the input image as the initial step of the proposed AES.
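The sketch below shows only the conventional baseline being modified: encrypting colour-image bytes with standard AES (via PyCryptodome) and inspecting the cipher-image histogram, one of the evaluation criteria listed above. The magic-square MixColumns replacement and the extra magic-square key proposed in the paper are not reproduced.

```python
# Baseline illustration only: standard AES over image bytes plus a histogram check.
import numpy as np
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

img = np.random.randint(0, 256, size=(64, 64, 3), dtype=np.uint8)   # placeholder colour image

key = get_random_bytes(16)
cipher = AES.new(key, AES.MODE_CTR, nonce=get_random_bytes(8))
enc_bytes = cipher.encrypt(img.tobytes())
enc_img = np.frombuffer(enc_bytes, dtype=np.uint8).reshape(img.shape)

# A near-flat histogram of the cipher image is one sign of a good image cipher.
hist, _ = np.histogram(enc_img, bins=256, range=(0, 256))
print("cipher-image histogram spread (std of bin counts):", round(hist.std(), 2))
```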

Indexed in Scopus

Abstract: This research presents a new hybridized evolutionary artificial intelligence (AI) model for modeling scour depth under submerged weirs. The proposed model is based on hybridizing the Extreme Gradient Boosting (XGBoost) model with a genetic algorithm (GA) optimizer. The GA is used to solve the hyper-parameter selection problem of the XGBoost model and to identify the influential input predictors of the scour depth. The proposed XGBoost-GA model is developed by incorporating fifteen physical parameters of the submerged weir. The feasibility of the XGBoost-GA model is validated against several well-established AI models introduced in the literature, in addition to a hybrid XGBoost-Grid model. Several statistical performance metrics were computed for the evaluation, in parallel with a graphical assessment. Based on the attained prediction results, the proposed model revealed a promising and superior predictive performance, with a maximum coefficient of determination (R² = 0.933) and a minimum root mean square error (RMSE = 0.014 m).
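The following is a minimal sketch of GA-style hyperparameter tuning wrapped around an XGBoost regressor. The synthetic data standing in for the fifteen weir parameters, the parameter ranges, and the selection/mutation scheme are all assumptions; a full genetic algorithm with crossover is not shown.

```python
# Minimal GA-style search over XGBoost hyper-parameters (illustration only).
import numpy as np
import xgboost as xgb
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 15))                                           # placeholder for 15 weir parameters
y = 0.3 * X[:, 0] + 0.1 * X[:, 1] ** 2 + rng.normal(scale=0.05, size=200)  # synthetic scour depth

def random_genes():
    return {"max_depth": int(rng.integers(2, 8)),
            "learning_rate": float(rng.uniform(0.01, 0.3)),
            "n_estimators": int(rng.integers(50, 400))}

def fitness(genes):
    model = xgb.XGBRegressor(**genes, verbosity=0)
    return cross_val_score(model, X, y, cv=3, scoring="neg_root_mean_squared_error").mean()

population = [random_genes() for _ in range(8)]
for generation in range(5):                      # tiny budget, illustration only
    parents = sorted(population, key=fitness, reverse=True)[:4]   # selection: keep the best half
    children = []
    for p in parents:                            # mutation: perturb one gene per child
        child = dict(p)
        child["learning_rate"] = float(np.clip(child["learning_rate"] * rng.uniform(0.5, 1.5), 0.01, 0.3))
        children.append(child)
    population = parents + children

print("best hyper-parameters found:", max(population, key=fitness))
```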

Indexed in Scopus

Abstract: Burn damage is a complicated trauma that causes local and general tissue edema as a result of cell breakage and capillary leak syndrome. Angiogenesis plays a key part in the mechanisms initiated by tissue damage (e.g., burns), since it acts directly and precisely on endothelial cells. The primary mediators of angiogenesis are vascular endothelial growth factor (VEGF) and its receptors (VEGFR-1 and VEGFR-2). This study aimed to determine the roles that VEGF and its receptors play in wound healing after burns, and the systemic release of VEGF in people following severe burn damage. The study included serum from 23 burned adults and 20 healthy controls. An enzyme-linked immunosorbent assay was used to measure circulating serum levels of VEGF and its receptors (VEGFR-1 and VEGFR-2). VEGF serum levels in burn patients were considerably higher than those in healthy controls. The levels of VEGFR-1 and VEGFR-2 were also significantly elevated; moreover, VEGF and its receptors have a significant impact on edema-related problems in severely burned individuals. Burns are a frequent injury that damages the skin and induces the production of mediators that cause neovasculature in the majority of patients, and VEGF, which drives vasculogenesis and angiogenesis, is one of the most important of these mediators.

Indexed in Scopus

Abstract: Fuzzy logic enables designers to control complex systems more effectively than conventional approaches, as it provides a simple way to reach definite conclusions from vague, imprecise, or noisy data. In this work, we propose the structure of a fuzzy logic controller with three inputs that determines the correct wash time of a washing machine. The goal is to save a considerable amount of time, power, and water when washing clothes. The paper describes an approach that can be used to obtain a suitable washing time for different fabrics. The procedure relies entirely on the principle of taking imprecise inputs from the sensors, subjecting them to fuzzy arithmetic, and obtaining a crisp value for the washing time.
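A hedged sketch of such a controller using scikit-fuzzy is shown below. The three inputs (dirt, grease, and load), the membership functions, and the rule base are assumptions chosen for illustration, since the abstract does not specify them.

```python
# Mamdani-style wash-time controller sketch (assumed inputs and rules).
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

dirt = ctrl.Antecedent(np.arange(0, 101, 1), "dirt")
grease = ctrl.Antecedent(np.arange(0, 101, 1), "grease")
load = ctrl.Antecedent(np.arange(0, 11, 1), "load")
wash_time = ctrl.Consequent(np.arange(0, 61, 1), "wash_time")   # minutes

for var in (dirt, grease):
    var["low"] = fuzz.trimf(var.universe, [0, 0, 50])
    var["high"] = fuzz.trimf(var.universe, [50, 100, 100])
load["small"] = fuzz.trimf(load.universe, [0, 0, 5])
load["large"] = fuzz.trimf(load.universe, [5, 10, 10])
wash_time["short"] = fuzz.trimf(wash_time.universe, [0, 0, 30])
wash_time["long"] = fuzz.trimf(wash_time.universe, [30, 60, 60])

rules = [
    ctrl.Rule(dirt["low"] & grease["low"] & load["small"], wash_time["short"]),
    ctrl.Rule(dirt["high"] | grease["high"] | load["large"], wash_time["long"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["dirt"], sim.input["grease"], sim.input["load"] = 70, 20, 6
sim.compute()                        # fuzzify, apply rules, defuzzify (centroid by default)
print("suggested wash time:", round(sim.output["wash_time"], 1), "minutes")
```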

Indexed in Scopus

Abstract: Owing to recent developments in wireless networking and the increasing data-rate requirements of consumers, advanced technology is needed within the available bandwidth. One promising solution is a 60 GHz Ultra-Wideband (UWB) system that supports data rates of up to 7 Gbps, considered 10 times faster than the Wi-Fi 802.11n data rate. The system uses advanced modulation and coding schemes known as adaptive modulation and coding (AMC). However, an inappropriate selection of modulation level and coding rate can lead to a low data rate in the presence of radio frequency interference and multipath fading. In addition, choosing a non-optimal decoder, owing to the code's constraint length, can degrade performance even further.
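To make the AMC idea concrete, the toy sketch below selects the most aggressive modulation and coding scheme whose SNR threshold the measured channel meets. The MCS table and thresholds are hypothetical and are not taken from the 60 GHz UWB specification.

```python
# Toy adaptive modulation and coding (AMC) selector with a hypothetical MCS table.
MCS_TABLE = [
    # (minimum SNR in dB, modulation, coding rate, relative data rate)
    (22.0, "64-QAM", "3/4", 1.00),
    (16.0, "16-QAM", "3/4", 0.67),
    (10.0, "QPSK",   "1/2", 0.33),
    ( 4.0, "BPSK",   "1/2", 0.17),
]

def select_mcs(snr_db: float):
    """Return the highest-rate modulation/coding pair supported at this SNR."""
    for threshold, modulation, rate, rel_rate in MCS_TABLE:
        if snr_db >= threshold:
            return modulation, rate, rel_rate
    return "BPSK", "1/2", 0.17        # fall back to the most robust scheme

for snr in (25.0, 12.5, 2.0):
    print(snr, "dB ->", select_mcs(snr))
```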

Indexed in Scopus

Abstract: An LDPC code is a particular class of linear block codes with a low-density, i.e. sparse, parity-check matrix H. Because of this sparsity, LDPC decoding has low complexity and is therefore simple to implement, and LDPC codes offer a wide spectrum of trade-offs between efficiency and complexity. Their high encoding complexity, however, remains a major drawback. Forward error detection and correction codes have been used for many years in storage applications and in wireless services and communication networks because of unreliable network connections. In this review paper, we present the main concept of LDPC codes and give an overview of error-correcting codes, encoding and decoding, and binary and non-binary codes. The paper also explains the challenges facing LDPC codes.
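The toy sketch below makes the parity-check idea concrete: a sparse H over GF(2) and a syndrome check on a received word. The small matrix shown is the classic Hamming(7,4) parity-check matrix, used only for readability; real LDPC matrices are far larger and sparser, and iterative belief-propagation decoding is not shown.

```python
# Syndrome check with a small sparse parity-check matrix over GF(2).
import numpy as np

H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)         # parity-check matrix

codeword = np.array([1, 1, 1, 0, 0, 0, 0], dtype=np.uint8)    # satisfies H @ c = 0 (mod 2)
received = codeword.copy()
received[4] ^= 1                                              # one bit flipped in the channel

def syndrome(H, word):
    """All-zero syndrome means every parity check is satisfied."""
    return (H @ word) % 2

print("codeword syndrome:", syndrome(H, codeword))   # [0 0 0] -> valid
print("received syndrome:", syndrome(H, received))   # non-zero -> error detected
```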

Indexed in Scopus

Abstract: A Vehicular Ad hoc Network (VANET) is a subclass of Mobile Ad hoc Networks (MANETs) with the potential to improve road safety and quality of service (QoS). VANETs face many challenges, such as high energy consumption and link instability. This review is organized into three parts. In the first part, we provide a comprehensive review of quality of service (QoS). In the second part, we explain some of the challenges facing QoS. Finally, we review data dissemination and all of its types.

Indexed in Scopus

Abstract: In this study, the QualNet simulator has been used as the environment for testing, simulation, and analysis, owing to its accurate results, speed, and ease of portability, which makes it possible to obtain realistic results before deploying the sensors and computers in the real world. The objective of this study is to establish a scenario for comparing the performance of the AODV, DSR, and OLSR routing protocols. The simulation comprises four main steps. Initially, the MANET environment was constructed to analyze the performance of a scalable network. In the last step, a simulation scenario was run with metrics such as Packet Delivery Ratio and End-to-End Delay over five CBR flows among 50 nodes. The results show which protocol performs better under different traffic loads, and all of these results depend on the CBR traffic delivered between the nodes. We conclude that AODV has the lowest End-to-End Delay and OLSR the highest; the signal received with AODV has the fewest errors and DSR the most; OLSR has the shortest queue length; and DSR delivers the highest number of packets while OLSR delivers the lowest.
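For reference, the two headline metrics can be computed from a per-packet trace as sketched below. The trace format (send and receive timestamps keyed by packet id) is hypothetical; QualNet's own statistics files are not parsed here.

```python
# Packet Delivery Ratio (PDR) and average End-to-End Delay from a hypothetical trace.
sent = {                      # packet id -> send time (s)
    1: 0.10, 2: 0.20, 3: 0.30, 4: 0.40, 5: 0.50,
}
received = {                  # packet id -> receive time (s); packet 4 was lost
    1: 0.14, 2: 0.26, 3: 0.33, 5: 0.58,
}

pdr = 100.0 * len(received) / len(sent)
avg_delay = sum(received[p] - sent[p] for p in received) / len(received)

print(f"Packet Delivery Ratio: {pdr:.1f}%")
print(f"Average End-to-End Delay: {avg_delay * 1000:.1f} ms")
```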

Indexed in Scopus