Search results: 278 resources
-
Let A be a Noetherian local ring with canonical module KA. We characterize A when KA is a torsionless, reflexive, or q-torsionfree module for an integer q ≥ 3. If A is a Cohen–Macaulay ring, H.-B. Foxby proved in 1974 that the A-module KA is q-torsionfree if and only if the ring A is q-Gorenstein. With mild assumptions, we provide a generalization of Foxby’s result to arbitrary Noetherian local rings admitting the canonical module. In particular, since the reflexivity of the canonical module is closely related to the ring being Gorenstein in low codimension, we also explore quasinormal rings, introduced by W. V. Vasconcelos. We provide several examples as well. © 2025 Walter de Gruyter GmbH, Berlin/Boston.
-
This paper surveys and summarizes Wolmer Vasconcelos’ results surrounding multiplicities, Hilbert coefficients, and their extensions. We particularly focus on Vasconcelos’ results regarding multiplicities and Chern coefficients, and other invariants which they bound. The Sally module is an important instrument introduced by Vasconcelos for this study, which naturally relates Hilbert coefficients to reduction numbers. © 2025 Walter de Gruyter GmbH, Berlin/Boston.
-
A degree of a module M is a numerical measure of information carried by M. We highlight some of Vasconcelos’ outstanding contributions to the theory of degrees, bridging commutative algebra and computational algebra. We present several degrees he introduced and developed, including arithmetic degree, jdeg, homological degree, cohomological degrees, canonical degree, and bicanonical degree. For the canonical and bicanonical degrees, we discuss recent developments motivated by our joint works [25, 19, 9]. © 2025 Walter de Gruyter GmbH, Berlin/Boston.
-
Coronary heart disease (CHD) is the leading global cause of death, making early detection essential. While coronary angiography is the diagnostic gold standard, its invasive nature poses risks, and non-invasive symptom-based methods often lack accuracy. Machine learning-powered computer-aided diagnostic systems can effectively address challenges in clinical decision-making. This work presents an Evolutionary Strategy-optimized Support Vector Machine (ES-SVM) model for classifying CHD based on non-invasive test results and patient characteristics. On the Coronary Heart Disease dataset, the proposed ES-SVM demonstrated strong precision, F1-scores, and accuracy. The results indicate that SVM performance can be significantly enhanced through evolutionary hyperparameter tuning, resulting in a reliable, non-invasive diagnostic tool for initial CHD screening and supporting early intervention techniques. © 2025 IEEE.
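The evolutionary tuning the abstract describes can be sketched in miniature. The following is an illustrative (1+1) evolution strategy with 1/5th-success-rule step control; the objective here is a hypothetical stand-in for an SVM's cross-validation error over log-scaled C and gamma, not the paper's actual model:

```python
import random

random.seed(0)

def validation_error(log_c, log_gamma):
    # Hypothetical stand-in for the cross-validation error of an SVM
    # trained with C = 10**log_c and gamma = 10**log_gamma; a real
    # objective would retrain and score the classifier each call.
    return (log_c - 1.0) ** 2 + (log_gamma + 2.0) ** 2

def one_plus_one_es(objective, x, sigma=1.0, iterations=300):
    """Minimal (1+1) evolution strategy with 1/5th-success-rule step control."""
    best = objective(*x)
    for _ in range(iterations):
        candidate = [xi + random.gauss(0.0, sigma) for xi in x]
        score = objective(*candidate)
        if score <= best:        # successful mutation: keep it, widen the step
            x, best = candidate, score
            sigma *= 1.22
        else:                    # failed mutation: shrink the step
            sigma *= 0.82
    return x, best

x_opt, err = one_plus_one_es(validation_error, [0.0, 0.0])
print(x_opt, err)
```

In practice the inner objective would wrap a library SVM and k-fold scoring; the step-size constants are conventional choices, not values from the paper.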
-
Freshwater salinization is an emerging threat to aquatic ecosystems across the planet, degrading habitats and negatively impacting wild populations. Deicing practices are a leading cause of freshwater salinization, particularly in the snowbelt region of North America where a variety of salts are widely applied to roads and other surfaces to melt snow and ice. Seasonal pools near roads are considered the most severely impacted aquatic habitats. Runoff into these low water-volume ponds can generate high salinity. Impacts of salt pollution are numerous, ranging from toxicity to population decline to impaired ecosystem function. Here, we investigate a suite of physiological consequences of salinization across multiple life history stages of the wood frog (Rana sylvatica), a pool-dwelling amphibian. Previous work has shown that salinized populations have diverged from unpolluted populations for a suite of physiological, morphological, and reproductive traits, and can experience severe edema (bloating) during the breeding season. Here, we measured swim performance before and after aspirating edema in wild-captured wood frogs to show that edema compromises adult aquatic locomotion during breeding. We also found that wood frog mothers from salinized ponds produce ova with inherently higher rates of water uptake compared to mothers from unpolluted pools, consistent with countergradient adaptation, but the ova are smaller. Finally, we found that exposure to road salt inhibits expansion of vitelline membranes in developing embryos and is associated with reduced embryo growth. Together, these results reveal the complexity of population level responses to freshwater salinization, highlighting that impacts occur across multiple life history stages, and that local populations might be evolving adaptations to cope with anthropogenic salinity gradients in freshwater habitats. © The Author(s) 2025. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology.
-
Modeling lipase activity aids researchers in optimizing features such as temperature, pH, and substrate concentration to boost enzyme performance. This is essential in biotechnology for improving the productivity and yield of processes such as fermentation, biodiesel production, and bioremediation. Fermentation is a highly complex, multivariable, and non-linear biotechnological process that produces bioactive materials. This study leverages artificial neural networks (ANN) to predict lipase activity in batch fermentation processes, addressing the inherent challenges in weight learning optimization often encountered with traditional algorithms like Backpropagation (BP). Several metaheuristic algorithms were employed to optimize the Multilayer Perceptron (MLP) structure and weights, including Moth-Flame Optimization (MFO), Particle Swarm Optimization (PSO), the Dandelion Optimizer (DO), the Crow Search Algorithm (CSA), and the Salp Swarm Algorithm (SSA), to overcome these limitations. Among the tested algorithms, MFO emerged as the most effective approach, achieving superior performance in weight learning with the best fitness value (i.e., mean square error (MSE)) of 0.6006. MFO-optimized ANN models deliver the most accurate predictions for lipase activity, highlighting their potential as a powerful tool for advancing industrial fermentation process optimization. © 2025 IEEE.
-
Traditional brain tumor diagnosis and classification are time-consuming and heavily reliant on radiologist expertise. The ever-growing patient population generates vast data, rendering existing methods expensive and inefficient. Deep Learning (DL) is a promising approach for developing automated systems to diagnose or segment brain tumors with high accuracy in less time. Within Deep Learning, Convolutional Neural Networks (CNNs) are potent tools for image classification tasks. This is achieved through a series of specialized layers: convolution layers that identify patterns within images, pooling layers that summarize these patterns, and fully connected layers that ultimately produce the output class. This study employed a CNN to classify brain tumors in T1-weighted contrast-enhanced images at various image resolutions, including 30×30, 50×50, 70×70, 100×100, and 150×150 pixels. The model successfully distinguished between three tumor types: glioma, meningioma, and pituitary. The CNN's accuracy on training data reached up to 86.38% at the lowest resolution (30×30) and 94.64% at the highest resolution (150×150). This indicates its potential as a valuable tool in real-world brain tumor classification tasks. © 2025 IEEE.
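The convolution and pooling layers named above can be illustrated with a dependency-free sketch. This is a toy valid-mode convolution and 2×2 max pool on a tiny synthetic "image"; the kernel and input are illustrative, not taken from the paper's network:

```python
def conv2d(image, kernel):
    """Valid-mode 2D convolution (really cross-correlation, as in CNNs)."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + u][j + v] * kernel[u][v]
                 for u in range(kh) for v in range(kw))
             for j in range(out_w)] for i in range(out_h)]

def max_pool(feature_map, size=2):
    """Non-overlapping max pooling that summarizes local activations."""
    return [[max(feature_map[i + u][j + v]
                 for u in range(size) for v in range(size))
             for j in range(0, len(feature_map[0]) - size + 1, size)]
            for i in range(0, len(feature_map) - size + 1, size)]

image = [[0, 0, 1, 1],          # a vertical bright edge down the middle
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge_kernel = [[-1, 1]]          # responds to left-to-right intensity increases
fmap = conv2d(image, edge_kernel)
print(fmap)                      # each row peaks where the edge sits
print(max_pool(fmap))
```

A real CNN stacks many such learned kernels and feeds the pooled activations into fully connected layers for classification.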
-
The λ-fold complete 3-uniform hypergraph on v vertices has the edge multiset consisting of λ copies of each 3-element subset of its vertex set. A tight 6-cycle, denoted TC6, is a hypergraph with vertex set {a,b,c,d,e,f} and edge set {{a,b,c},{b,c,d},{c,d,e},{d,e,f},{e,f,a},{f,a,b}}. We give necessary and sufficient conditions on v for the existence of a TC6-decomposition of the λ-fold complete 3-uniform hypergraph on v vertices for any positive integer λ. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2025.
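The tight-cycle structure above is easy to make concrete: the edges of TC6 are exactly the sets of three cyclically consecutive vertices. A short sketch generating that edge set and the edge count of the complete 3-uniform hypergraph on 6 vertices:

```python
from itertools import combinations

def tight_cycle_edges(vertices):
    """Edges of the tight cycle on an ordered vertex list: every set of
    three cyclically consecutive vertices."""
    n = len(vertices)
    return [frozenset(vertices[(i + k) % n] for k in range(3))
            for i in range(n)]

tc6 = tight_cycle_edges("abcdef")
# Matches the abstract's edge set {abc, bcd, cde, def, efa, fab}.
print(sorted("".join(sorted(e)) for e in tc6))

# The complete 3-uniform hypergraph on 6 vertices has C(6,3) = 20 edges,
# so a TC6-decomposition must partition multiples of 6 edges.
print(len(list(combinations("abcdef", 3))))
```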
-
Intensity interferometry, also known as the Hanbury Brown and Twiss effect, has seen significant interest in astronomy in recent years. The method involves recording timing correlations between photons received at two or more telescopes in order to derive extremely high spatial resolution information about an astronomical object, potentially including imaging stellar surfaces and other objects at unprecedented scales. This paper will briefly review the technique, discuss the performance characteristics of the photon counters used in modern intensity interferometers, and describe opportunities for the future. As an example of photon counting with a working instrument, observing experiences with the Southern Connecticut Stellar Interferometer (SCSI), a three-station instrument using single-photon avalanche diode (SPAD) detectors, will be described. The recent lessons learned with this and other instruments in use today give a clear picture of the next steps needed to upgrade efficiency and successfully observe fainter objects. If successful, these improvements would provide a strong argument for creating situations where intensity interferometers can have baselines of one to several kilometers, which would unlock the spatial detail needed to address several exciting astrophysical questions.
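At its core, the timing-correlation measurement counts photon pairs from two detectors that arrive within a short coincidence window. A minimal sketch of such a coincidence counter over sorted timestamp lists (the timestamps and window below are illustrative, not instrument data):

```python
def coincidences(times_a, times_b, window):
    """Count photon pairs from two detectors whose arrival times differ
    by at most `window` (both timestamp lists must be sorted)."""
    count, j = 0, 0
    for t in times_a:
        while j < len(times_b) and times_b[j] < t - window:
            j += 1                       # skip photons too early to pair with t
        k = j
        while k < len(times_b) and times_b[k] <= t + window:
            count += 1                   # every photon inside the window pairs with t
            k += 1
    return count

# Illustrative timestamps (seconds) from two detectors, 100 ms window.
n_pairs = coincidences([0.0, 1.0, 2.0, 3.0], [0.05, 1.5, 2.02], 0.1)
print(n_pairs)
```

A real instrument normalizes this count against the accidental-coincidence rate to estimate the second-order correlation, and hence the source's spatial coherence.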
-
The exponential growth of big data, driven by AI and machine learning technologies, underscores the need for an ethical and sustainable approach to data utilization. Using problematization methodology, we consider the assumptions underpinning Big Data and AI and reconsider them from a sensemaking perspective. Big data represents an enactment rather than an objective reality, and organizations play an active role in its adoption and use. Strategizing is driven by plausibility rather than accuracy, and big data generates a retrospection of the past rather than a prediction of the future. A sensemaking perspective serves as a reality check for managers, emphasizing the necessity of long-term sustainability and societal well-being. By cultivating experiments for learning communities and incubating innovation, organizations can effectively leverage big data in marketing, fostering transparent, collaborative, ethical, and sustainable data practices. © 2025 IEEE Computer Society. All rights reserved.
-
Lung Adenocarcinoma (LUAD) and lung squamous cell carcinoma (LUSC) are the two main histology subtypes of non-small cell lung cancer (NSCLC), accounting for about 70% of all lung cancers. In this article, we propose an ensemble-based model for the identification of subtypes of NSCLC using methylation data. The proposed Random Forest-based model, combined with an out-of-bag (OOB) error-based feature selection technique, identified the top ten most important CpG sites that are highly discriminative between the LUSC and LUAD subtypes of NSCLC, with an accuracy, precision, and F1 score of 97%. The proposed model outperformed the other existing models for the same purpose by a large margin of 12%. Pathway analysis of the proposed 10 CpG sites revealed different pathways for LUAD- and LUSC-associated genes: LUAD-associated genes primarily participated in TP53, PTEN, GLP-1, incretin regulation, and apoptosis. Conversely, LUSC-associated genes were predominantly involved in pathways for platelet degranulation, serine biosynthesis, and Nephrin family interaction.
-
Retinal Detachment (RD) is one of the major problems for patients with retinal disorders. To date, no confirmatory sign or marker on the retina exists for the early detection of RD. Therefore, patients may suffer sudden RD at any time in their life. Moreover, the final diagnostic decision on RD depends entirely upon the subjective judgement of the ophthalmologist. To support the ophthalmologist's decision-making process, in this article we propose RDNet, a SqueezeNet-architecture-based deep learning model for the early detection of RD. We used a publicly available dataset of 1017 images covering rhegmatogenous RD and a control group. The proposed model built on this image set achieved 97.55% sensitivity, 99.26% specificity, and 98.23% accuracy in detecting RD. The proposed model outperformed the existing models for the same purpose with the highest area under the ROC curve (AUC) of 0.995. We believe our model will support the early detection of RD in a clinical setup and assist the ophthalmologist in identifying RD at its early stage.
-
Photovoltaic systems have proven to be one of the most widely used renewable energy sources and the best replacement for conventional energy. Yet, their non-linear nature remains a challenge when it comes to extracting maximum power from photovoltaic modules. Therefore, in this work, a nonlinear PID (NPID) controller has been used to meet the requirements of the photovoltaic system. In addition, to improve system performance and response, metaheuristic search algorithms were introduced into the tuning process of both the NPID controller and a conventional PID controller in order to compare them. The use of artificial intelligence to fine-tune the controller parameters enables the optimum values of the proportional, integral, derivative, and nonlinear gains to be determined as system conditions change. Finally, a comparison between the applied algorithms is conducted in terms of efficiency, rise time, settling time, and overshoot, as well as overall system stability.
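An NPID controller of the kind described typically passes the tracking error through a nonlinear gain before the usual PID terms. The sketch below is one common formulation driving a toy first-order plant; the nonlinearity, all gains, and the plant model are illustrative choices, not the paper's tuned system:

```python
class NonlinearPID:
    """Discrete PID whose error is shaped by a nonlinear gain before the
    P, I, and D terms (an illustrative NPID formulation)."""
    def __init__(self, kp, ki, kd, alpha, dt):
        self.kp, self.ki, self.kd, self.alpha, self.dt = kp, ki, kd, alpha, dt
        self.integral = 0.0
        self.prev = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        shaped = error * (1.0 + self.alpha * abs(error))   # nonlinear error gain
        self.integral += shaped * self.dt
        derivative = (shaped - self.prev) / self.dt
        self.prev = shaped
        return self.kp * shaped + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant toward a unit setpoint.
pid = NonlinearPID(kp=2.0, ki=1.0, kd=0.05, alpha=0.5, dt=0.01)
x = 0.0
for _ in range(1000):
    u = pid.step(1.0, x)
    x += 0.01 * (-x + u)     # simple stand-in plant, dx/dt = -x + u
print(round(x, 3))           # settles near the setpoint of 1.0
```

In the paper's setting, a metaheuristic would search over (kp, ki, kd, alpha) to minimize a cost built from rise time, settling time, and overshoot.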
-
Epidermolysis bullosa acquisita (EBA) represents a big challenge as a rare skin disorder, with no established markers for early detection. Moreover, as a rare disease, it is extremely difficult to acquire a sufficient number of patient samples to diagnose it accurately with high confidence. EBA shares many biomarkers with other bullous diseases and requires specific clinical expertise to detect using immunofluorescence microscopy. In this study, we introduce a deep learning-based method, EBAnet, that leverages a Convolutional Neural Network (CNN)-based model for the detection of EBA from Direct immunofluorescence (DIF) microscopy images. The proposed EfficientNet-based model achieved 97.3% sensitivity, 96.1% precision, and 96.7% accuracy in distinguishing EBA from the other classes, and outperformed the existing model for the same purpose. GradCAM-based class activation maps also highlighted the important regions of the DIF images that the proposed model focused on, improving the model's explainability. We believe EBAnet will add value in the early and accurate detection of EBA, addressing a critical need in clinical practice.
-
This study developed a framework for predicting usability factors through an understanding of how cognitive traits relate to human interaction with a computer system. Specifically, this study examined the relationship of field-independence, spatial visualization, logical reasoning, and integrative reasoning to interaction process and outcome. The research hypothesis was tested through correlation to determine the relationships among variables. As a post hoc analysis, multiple regression analysis was used to examine the predictive power of the four cognitive variables on interaction outcome. The results of the study emphasize the importance of considering cognitive variables as important predictors of human interaction process and outcome. © 2024 IEEE.
-
Using the 2013 data set provided by Insurance Inc., logistic regression and linear discriminant analysis models were created, along with data visualizations, to determine which factors recorded in the data set, and which states of those factors, cause a client to cancel their policy. The factors that impact whether a client will cancel are those that directly pertain to the policy. For example, the coverage type and the premium the client is paying for the policy impact the probability the client will cancel their policy. Factors that go into forming the policy and have a relationship with one another, such as age and premium, also impact the probability that a client will cancel their policy. The credit status of a client, whether it is low, medium, or high, and the type of coverage they have, have the most impact on a client's likelihood of cancelling. If a client's credit score is classified as low, then that client has a high probability of cancelling their policy according to the LDA (Linear Discriminant Analysis) classifier and logistic regression model. Likewise, if a client has coverage type B, the probability that they will cancel their policy is higher. The sales channel used to sell a client a policy also impacts the probability they will cancel. According to the LDA classifier and the logistic regression model, if a client was sold a policy over the phone, they are more likely to cancel. © 2024 IEEE.
-
This research introduces the application of an innovative bio-inspired metaheuristic technique, termed the Crow Search Algorithm (CSA), to model a crucial industrial process - hot rolling manufacturing. Inspired by the foraging patterns of crows, the CSA algorithm has demonstrated its prowess in solving diverse optimization challenges. In the context of this study, the CSA algorithm is harnessed to fine-tune the parameters of a simulation model focused on predicting the force exerted during a hot rolling procedure. The proposed model takes into consideration a range of influential factors, including the initial temperature (Ti), width (Ws), carbon equivalent (Ce), gauge (hi), draft (i), and roll diameter (R). The findings underscore the CSA's capability to deliver exceptional modeling performance characterized by swift convergence and high solution quality. Because the CSA algorithm fits the proposed model so well, it offers a robust and efficient avenue for optimizing the hot rolling process, with the potential for expansion into other manufacturing domains. The computational and simulation results demonstrated that the proposed CSA-based approach outperformed different metaheuristic search algorithms, such as the Salp Swarm Algorithm (SSA), Dandelion Optimizer (DO), Particle Swarm Optimization (PSO), Gray Wolf Optimizer (GWO), and Moth-Flame Optimization (MFO), in all test cases. The CSA achieved the highest coefficient of determination (R2), equal to 0.97244, and the lowest mean squared error (MSE), equal to 1904.97, compared to its opponent algorithms. © 2024 IEEE.
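The CSA itself is compact: each crow tracks a memorized best position, and at every step follows a randomly chosen crow's memory unless that crow is "aware" and the follower is diverted to a random spot. A minimal sketch on a toy sphere objective (all parameters and the objective are illustrative, not the paper's force model):

```python
import random

random.seed(1)

def sphere(x):
    return sum(v * v for v in x)

def crow_search(objective, dim=2, n_crows=10, iters=200,
                fl=2.0, ap=0.1, lo=-5.0, hi=5.0):
    """Minimal Crow Search Algorithm: fl is flight length, ap is
    awareness probability."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_crows)]
    mem = [p[:] for p in pos]                 # each crow's best "hiding place"
    fit = [objective(m) for m in mem]
    for _ in range(iters):
        for i in range(n_crows):
            j = random.randrange(n_crows)     # crow i follows crow j
            if random.random() >= ap:         # j unaware: move toward j's memory
                new = [pos[i][d] + random.random() * fl * (mem[j][d] - pos[i][d])
                       for d in range(dim)]
            else:                             # j aware: crow i is fooled
                new = [random.uniform(lo, hi) for _ in range(dim)]
            if all(lo <= v <= hi for v in new):
                pos[i] = new
                f = objective(new)
                if f < fit[i]:                # memory only improves
                    mem[i], fit[i] = new[:], f
    best = min(range(n_crows), key=lambda i: fit[i])
    return mem[best], fit[best]

best_x, best_f = crow_search(sphere)
print(best_x, best_f)
```

For the paper's use case, the objective would instead be the error between the force model's predictions and measured rolling-force data over the candidate parameters.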
-
Symbolic regression techniques are promising approaches to learning mathematical models that fit experimental data. One of the most powerful techniques for symbolic regression is Grammatical Evolution (GE). This evolutionary computation technique explores a space of candidate models that are ensured to be syntactically correct expressions built from a set of arbitrary building blocks and operators. In GE the syntax for these expressions is defined by a problem-specific formal grammar. Therefore, GE can produce an explainable solution (e.g. a formula), not a black-box model. The current contribution assesses the viability of GE for PSF characterization, using real datasets from HST/WFPC2. Our experiments show that our method is able to find the most likely candidate mathematical expression for the PSF shape, and can also model combinations of shapes taken from a predefined family of functions commonly used in astronomy (Gaussian and Moffat PSFs). These results support the hypothesis that the expressive power of GE can be used to tackle the problem of characterization of complex PSF functions, for example, as a necessary step in the prediction of intra-pixel position of stars. © 2024 SPIE.
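The genotype-to-expression mapping at the heart of GE can be shown in a few lines. In the sketch below, each integer codon, taken modulo the number of productions, expands the leftmost nonterminal of a toy grammar; the grammar and genome are illustrative, far simpler than the Gaussian/Moffat PSF grammars the paper would use:

```python
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["(", "<expr>", ")"], ["<var>"]],
    "<op>": [["+"], ["-"], ["*"]],
    "<var>": [["x"], ["1.0"]],
}

def ge_map(genome, start="<expr>", max_wraps=2):
    """Map an integer genome to an expression string: each codon, modulo
    the number of choices, picks a production for the leftmost nonterminal."""
    symbols = [start]
    out = []
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym not in GRAMMAR:
            out.append(sym)              # terminal: emit it
            continue
        if i >= len(genome):             # ran out of codons: wrap around
            i, wraps = 0, wraps + 1
            if wraps > max_wraps:
                return None              # mapping failed to terminate
        choices = GRAMMAR[sym]
        rule = choices[genome[i] % len(choices)]
        i += 1
        symbols = list(rule) + symbols   # expand leftmost-first
    return "".join(out)

print(ge_map([0, 2, 0, 0, 2, 1]))
```

Every genome maps to a syntactically valid expression, which is why the evolved model is directly readable as a formula rather than a black box.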
-
In this paper, we develop an indoor positioning system using smartphones. An indoor positioning system plays a vital role in indoor spaces such as home, office, university, airport, and hospital buildings by locating and tracking persons, devices, and assets. Our indoor positioning system is applicable in any indoor space that has smart devices such as smartphones, tablets, smartwatches, and robots with a Wi-Fi connection. We used a Wi-Fi-based fingerprinting technique to build our indoor positioning system because a Wi-Fi-based system can leverage existing Wi-Fi infrastructure and hence is cost-effective. A major challenge in implementing a Wi-Fi-based fingerprinting technique is the missed access points (APs) problem. In this paper, we address this critical challenge by proposing a localization procedure called ‘cosine similarity + k-means clustering’. In this localization procedure, we leverage the k-means clustering algorithm to identify the wrong location estimates produced by the cosine similarity measure because of the missed-APs problem. To evaluate the effectiveness of our proposed localization procedure, we collected data from three different scenarios, specifically home, office, and university, for creating the signal map and performing localization tests. Additionally, we tested both stationary and walking data. Our experimental results show that our ‘cosine similarity + k-means clustering’ localization procedure is effective in mitigating the detrimental impact of missed APs, and consequently, it significantly improves localization accuracy.
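The cosine-similarity half of the procedure compares a live RSSI sample against each reference fingerprint in the signal map. A minimal sketch with a toy three-AP signal map (the reference points and RSSI values are illustrative; the paper's k-means refinement step, which filters out bad estimates, is not shown):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two RSSI fingerprints; a missed AP is
    represented as 0 in that vector slot."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Toy signal map: reference point -> RSSI magnitudes over 3 APs.
signal_map = {
    "kitchen": [70, 55, 0],    # AP3 is not heard at this spot
    "office":  [40, 80, 65],
    "bedroom": [0, 45, 85],
}

def localize(sample):
    """Pick the reference point whose fingerprint is most similar."""
    return max(signal_map,
               key=lambda rp: cosine_similarity(signal_map[rp], sample))

print(localize([68, 50, 0]))   # a sample that also misses AP3
```

When an AP is missed in the live sample but present in a fingerprint (or vice versa), similarities can be distorted, which is exactly the failure mode the paper's k-means stage is designed to catch.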