Ciências e Tecnologia | Book chapters/papers in international books
Recent entries
- BCpbP: Blue Circular Post-Branding Project, a local Portuguese case study for a blue circular economy (Duarte, Carlos; Farinha, Isabel; Miguel, Rui; Pestana, Gabriel; Leal, Nuno Sá; Seixas, Sónia; Robalo, José; Jarboui, B.; Toumi, S.; Siarry, P.). Portugal has committed itself to the 2030 Agenda for Sustainable Development, adopted by the United Nations in September 2015. Since then, Portugal has prioritised the conservation and sustainable use of the ocean, and promoting these goals is a key strategic priority for the country. According to BCpbP data, around 2400 tons of fishing gear are destroyed or abandoned every year in Portugal, based on fishing activity in the ports of the Cascais Captaincy. This case study aimed to (1) create value from marine waste, (2) promote sustainable design alternatives to current consumption patterns, and (3) raise awareness and literacy about ocean sustainability, contributing to the Blue Economy. An R&D methodology adapted to the Living Labs concept was adopted, in which innovations such as services, products, or application improvements are created and validated in a collaborative metacontext environment. As a result, a macro-process called the Blue Big Bag was established as a chain of custody for the circular economy. BCpbP can thus serve as an example of a blue circular economy approach. Its main purpose is to combat the excessive consumption of goods and reduce unnecessary waste, while sensitising partners to environmental education. It also aims to contribute to the removal of tons of end-of-life materials and products from the ocean, namely fishing nets and gear (DFG) and other marine litter [5], in order to create new and sustainable products.
- Population growth and geometrically-thinned extreme value theory (Brilhante, Maria de Fátima; Gomes, Maria Ivette; Mendonça, Sandra; Pestana, Dinis; Pestana, Pedro Duarte; Henriques-Rodrigues, L.; Menezes, R.; Machado, L.M.; Faria, S.; de Carvalho, M.). Starting from the simple Beta(2,2) model, connected to the Verhulst logistic parabola, several extensions are discussed and connections to extremal models are revealed. Aside from the classical general extreme value model, extreme value models in randomly stopped extremes schemes are also discussed. Logistic and Gompertz growth equations are the usual choices for modelling sustainable growth. Therefore, observing that the logistic distribution is (geo-)max-stable and that the Gompertz function is proportional to the Gumbel max-stable distribution, other growth models, related to classical and to geometrically thinned extreme value theory, are investigated.
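The two distributional facts invoked in this abstract can be written out explicitly. The following is a short derivation using standard parameterisations, not notation taken from the chapter itself:

```latex
% Gompertz growth is proportional to the Gumbel (max-stable) CDF:
\[
  N(t) = K \exp\!\bigl(-b\,e^{-ct}\bigr)
       = K\,\Lambda\!\Bigl(\tfrac{t-\mu}{\sigma}\Bigr),
  \qquad
  \Lambda(x) = \exp(-e^{-x}),\quad
  \mu = \tfrac{\ln b}{c},\quad
  \sigma = \tfrac{1}{c}.
\]
% Geo-max-stability of the logistic CDF F(x) = 1/(1 + e^{-(x-\mu)/\sigma}):
% a geometric(p) maximum of i.i.d. logistic variables is again logistic,
\[
  \sum_{n \ge 1} p(1-p)^{n-1} F^{\,n}(x)
  = \frac{p\,F(x)}{1-(1-p)F(x)}
  = \frac{1}{1 + e^{-\left(\frac{x-\mu}{\sigma} - \ln\frac{1}{p}\right)}},
\]
% i.e. the same distribution shifted in location by \sigma \ln(1/p).
```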
- Cyber-vulnerabilities life cycle and risk assessment (Pestana, Pedro Duarte; Rocha, Maria Luísa; Sequeira, Fernando; Lovric, Miodrag). Dictionary entry.
- BetaBoop function, BetaBoop random variables and extremal population growth (Brilhante, Maria de Fátima; Pestana, Pedro Duarte; Lovric, Miodrag). Dictionary entry.
- Risk assessment of vulnerabilities exploitation (Brilhante, Maria de Fátima; Pestana, Pedro Duarte; Rocha, Maria Luísa; Sequeira, Fernando; Henriques-Rodrigues, L.; Menezes, R.; Faria, S.). Using the Kolmogorov–Smirnov, Cramér–von Mises and Anderson–Darling tests, as well as the less commonly applied Vuong's test, it is shown that a two-component hyperlog-logistic distribution, i.e., a mixture of two geo-max-stable log-logistic distributions, provides a good fit for the time from disclosure to update of vulnerabilities sampled from the CVEdetails.com database. It is also shown that the hyperlog-logistic distribution provides a better fit than a heavy-tailed distribution of maxima, a log-logistic distribution, or even a heavy-tailed two-component hyperexponential distribution. Moreover, ways of incorporating uncertainty and of modeling the vulnerability life cycle into the Common Vulnerability Scoring System (CVSS), the most widely used score for assessing the severity of vulnerabilities, are discussed, in order to obtain an improved CVSS calculator and track the evolution of a score over time.
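As an illustration of the fitting approach this abstract describes, the sketch below draws a sample from a hypothetical two-component log-logistic mixture (weights and parameters invented for the example, not taken from the chapter or from the CVEdetails.com data) and computes the Kolmogorov–Smirnov statistic against the mixture CDF:

```python
import random

# Hypothetical mixture for time-to-update in days: (weight, scale, shape).
COMPONENTS = [
    (0.7, 30.0, 2.0),    # fast-patching component
    (0.3, 300.0, 1.5),   # slow-patching, heavier-tailed component
]

def loglogistic_cdf(x, alpha, beta):
    """Log-logistic CDF: F(x) = 1 / (1 + (x/alpha)^(-beta))."""
    return 1.0 / (1.0 + (x / alpha) ** (-beta))

def mixture_cdf(x):
    """CDF of the two-component (hyperlog-logistic) mixture."""
    return sum(w * loglogistic_cdf(x, a, b) for w, a, b in COMPONENTS)

def sample_mixture(rng, n):
    """Inverse-transform sampling: pick a component, then invert its CDF."""
    out = []
    for _ in range(n):
        w, a, b = COMPONENTS[0] if rng.random() < COMPONENTS[0][0] else COMPONENTS[1]
        u = rng.random()
        out.append(a * (u / (1.0 - u)) ** (1.0 / b))  # F^{-1}(u)
    return out

def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov statistic D_n = sup_x |F_n(x) - F(x)|."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        fx = cdf(x)
        d = max(d, abs((i + 1) / n - fx), abs(i / n - fx))
    return d

rng = random.Random(42)
sample = sample_mixture(rng, 2000)
print(ks_statistic(sample, mixture_cdf))  # small D_n indicates a good fit
```

Since the sample is drawn from the same mixture it is tested against, the statistic stays well below the usual rejection thresholds; fitting real disclosure-to-update times would instead estimate the weights and parameters first, as the chapter does.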
- Causal machine learning in social impact assessment (Lopes, Nuno Castro; Cavique, Luís; Moutinho, Luiz; Bigné, Enrique). Social impact assessment is a fundamental process for verifying that the objectives of interventions are achieved and, consequently, for validating investments in the social area. Generally, this process is based on analysing the average effects of an intervention, which does not allow a detailed understanding of how these effects vary across individuals. Causal machine learning methods mark an evolution in causal inference, as they allow a more heterogeneous assessment of the effects of interventions. Applying these methods to evaluate the impact of social projects and programs offers the advantage of improving the selection of target audiences and of optimizing and personalizing future interventions. In this chapter, in a non-technical way, the authors explore classical causal inference methods for estimating average effects and new causal machine learning methods for evaluating heterogeneous effects. They address adapting the Uplift Modeling method to assess social interventions, as well as the advantages, limitations, and research needs for using these new techniques in social intervention.
- Impact of artificial intelligence in Industry 4.0 and 5.0 (Moutinho, Luiz; Cavique, Luís; Bigné, Enrique). Industry 4.0 uses the network concept to establish an interconnected manufacturing system, integrating recent digital concepts such as artificial intelligence (AI), the Internet of Things (IoT), big data, cloud computing, and 3D printing. The next maturity level, Industry 5.0, aims to shift the focus back to human-centric production by creating a sustainable and collaborative environment for humans and machines. Every manufacturer aims to find new ways to increase profits, reduce risks, and improve production efficiency. AI tools can process and interpret vast volumes of data from the production floor to spot patterns, analyze and predict consumer behavior, and detect real-time anomalies in production processes. This work studies the impact of AI in Industries 4.0 and 5.0: in Industry 4.0, AI can help with classic tasks such as predictive maintenance, production optimization, and customer personalization, while Industry 5.0 enables sustainable manufacturing development and human-AI interaction.
- Causality: the next step in artificial intelligence (Cavique, Luís; Moutinho, Luiz; Bigné, Enrique). Judea Pearl's ladder of causation framework has dramatically influenced the understanding of causality in computer science. Despite advancements in artificial intelligence (AI), grasping causal relationships remains challenging, underlining the significance of the causal revolution in improving AI's understanding of cause and effect. The work presents a novel taxonomy of causal inference methods, clarifying the diverse approaches for inferring causality from data, and highlights the implications of causality for responsible AI and explainable AI (xAI), addressing bias in AI systems. The chapter points to causality as the next step in AI for creating new questions, developing causal tools, and clarifying opaque models with xAI approaches, and it clarifies the significance and implications of causal models in various AI subareas.
- Networks and connectivity: metrics and models (Cavique, Luís). This work explores network science as a means to understand and visualize the intricate interconnectivity within organizations. The age of big data emphasizes the importance of deriving new insights by transforming data into networks in order to study their connections. The document introduces a three-step maturity framework for navigating network science: starting with the basics of network construction, moving on to standard metrics, and finally examining network topology and dynamics. The author aims to clarify the subject and encourage further exploration, suggesting that while network science may not have all the answers, it offers a critical analytical framework.
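The three steps of the maturity framework named above (construction, standard metrics, topology) can be sketched on a toy network. The graph and the tie names below are invented for illustration, not taken from the chapter:

```python
from collections import deque

# Hypothetical undirected collaboration network: five actors, five ties.
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]

# Step 1: network construction -- build an adjacency list from the tie data.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

# Step 2: standard metrics -- degree per node and overall density.
degree = {node: len(neigh) for node, neigh in adj.items()}
n, m = len(adj), len(edges)
density = 2 * m / (n * (n - 1))  # fraction of possible ties that exist

# Step 3: topology -- is the network a single connected component? (BFS)
seen, queue = {"A"}, deque(["A"])
while queue:
    for w in adj[queue.popleft()]:
        if w not in seen:
            seen.add(w)
            queue.append(w)
connected = len(seen) == n

print(degree, round(density, 2), connected)
```

On this toy graph, "C" has the highest degree (3), half of all possible ties exist, and the network is connected; the same three-step progression scales to organizational data with richer metrics such as centrality and clustering.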
- Texture for neuroimaging (Nunes, Ana; Serranho, Pedro; Castelo-Branco, Miguel; Bernardes, Rui). Texture analysis is an umbrella term for multiple image analysis techniques that quantify and characterize the distribution of an image's gray levels. It has a natural application in biomedical image analysis, where texture-based techniques are increasingly being incorporated into neuroimaging research. In this chapter, the role of texture analysis in the field of neuroimaging is addressed. Neuroimaging applications of texture-based approaches are contextualized within past and recent developments in image texture analysis, the categories of texture analysis methods, and the typical texture-based problem types, namely classification and segmentation. Neuroimaging applications using magnetic resonance imaging, positron emission tomography, and optical coherence tomography are reviewed individually, and some considerations on future perspectives for texture-based approaches in neuroimaging are made.
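One classical way to "quantify the distribution of an image's gray levels", as the abstract puts it, is the gray-level co-occurrence matrix (GLCM). The sketch below applies it to a tiny invented image; the GLCM is a standard texture technique, not necessarily the specific method the chapter reviews:

```python
# Hypothetical 4x4 image with 3 gray levels (0, 1, 2).
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 2, 2],
]
levels = 3

# Count horizontal neighbour pairs (offset dx=1, dy=0) into the GLCM.
glcm = [[0] * levels for _ in range(levels)]
for row in image:
    for a, b in zip(row, row[1:]):
        glcm[a][b] += 1

# Normalise to joint probabilities, then derive two common texture
# features: contrast (local gray-level variation) and energy (uniformity).
total = sum(sum(r) for r in glcm)
p = [[c / total for c in r] for r in glcm]
contrast = sum(p[i][j] * (i - j) ** 2 for i in range(levels) for j in range(levels))
energy = sum(p[i][j] ** 2 for i in range(levels) for j in range(levels))
print(contrast, energy)  # 0.5  0.2638...
```

Features like these, computed over regions of MRI, PET, or OCT scans, are what texture-based classification and segmentation pipelines feed to their models.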
