In: Complex networks and their applications VI : proceedings of complex networks 2017 ; the sixth international conference on complex networks and their applications / Chantal Cherifi... (eds.)
Cham : Springer, 2018
(Studies in computational intelligence ; 689)
Abstract: This study proposes ComSim, a new algorithm to detect communities in bipartite networks. This approach generates a partition of the nodes by relying on the similarity between nodes in terms of their links towards other nodes. To show the relevance of this approach, we implemented and tested the algorithm on two small datasets equipped with a ground-truth partition of the nodes. It turns out that, compared to three baseline algorithms used in the context of bipartite graphs, ComSim proposes the best communities. In addition, we tested the algorithm on a large-scale network. Results show that ComSim performs well, with a running time close to that of Louvain. Besides, a qualitative investigation of the communities detected by ComSim reveals that it proposes more balanced communities.
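The abstract does not specify ComSim's exact similarity measure, so the following is only an illustrative sketch of the general idea of link-based similarity in a bipartite graph, using Jaccard overlap of neighborhoods as a stand-in:

```python
# Illustrative only: Jaccard overlap of neighbor sets as one possible
# notion of "similarity in terms of links towards nodes" in a bipartite
# graph; ComSim's actual measure may differ.

def jaccard_similarity(neighbors_a, neighbors_b):
    """Jaccard overlap between two neighbor sets."""
    inter = len(neighbors_a & neighbors_b)
    union = len(neighbors_a | neighbors_b)
    return inter / union if union else 0.0

# A toy bipartite graph: top nodes mapped to the bottom nodes they link to.
graph = {
    "u1": {"a", "b", "c"},
    "u2": {"a", "b"},
    "u3": {"c", "d"},
}

sim_u1_u2 = jaccard_similarity(graph["u1"], graph["u2"])  # 2/3
sim_u1_u3 = jaccard_similarity(graph["u1"], graph["u3"])  # 1/4
```

Nodes with high pairwise similarity would then be grouped into the same community.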
In: International journal of communication, 12 (2018), p. 450-472
Abstract: Klout scores and similar measures are often called 'vanity metrics' because they measure and display performance in (what is referred to as) the 'success theater' of social media. The notion of vanity metrics implies a critique of metrics concerning both the object of measurement and their capacity to measure unobtrusively or only to encourage performance. While discussing that critique, the article, however, focuses mainly on how one may consider reworking the metrics. In the research project I call 'critical analytics,' the proposal is to repurpose 'alt metrics' scores and other engagement measures for social research, seeking to measure the 'otherwise engaged,' or modes of engagement other than vanity in social media, such as dominant voice, concern, commitment, positioning and alignment, thereby furnishing digital methods with a conceptual and applied research agenda concerning online metrics.
In: Futures : the journal of policy, planning and futures studies, 95 (2018), p. 118-138
Abstract: Traditional scientific policy approaches and tools are increasingly seen as inadequate, or even counter-productive, for many purposes. In response to these shortcomings, a new wave of approaches has emerged based on the idea that societal systems are irreducibly complex. The new categories that are thereby introduced - like 'complex' or 'wicked' - suffer, however, from a lack of shared understanding. We here aim to reduce this confusion by developing a meta-ontological map of types of systems that have the potential to 'overwhelm us': characteristic types of problems, attributions of function, manners of design and governance, and generating and maintaining processes and phenomena. This permits us, in a new way, to outline an inner anatomy of the motley collection of system types that we tend to call 'complex'. Wicked problems here emerge as the product of an ontologically distinct and describable type of system that blends dynamical and organizational complexity. The framework is intended to provide systematic meta-theoretical support for approaching complexity and wickedness in policy and design. We also point to a potential causal connection between innovation and wickedness as a basis for further theoretical improvement.
Abstract: This paper employs a novel method for the empirical analysis of political discourse and develops a theoretical model that demonstrates dynamics comparable with the empirical data. Applying a set of binary text classifiers based on convolutional neural networks, we label statements in the political programs of the Democratic and the Republican Party in the United States. Extending the framework of the Colonel Blotto game by a stochastic activation structure, we show that, under a simple learning rule, parties show temporal dynamics that resemble the empirical data.
Gothenburg, Sweden : Chalmers University of Technology, 2017. - VI, 422 p.
(Doktorsavhandlingar vid Chalmers tekniska högskola ; 4215)
Also: Gothenburg, Chalmers University of Technology, dissertation
Abstract: This thesis engages with questions on the boundary between what has traditionally been understood as social and natural. The introductory essay contextualizes the specific contributions of the included papers by noting and exploring a reinvigoration of 'naturalism' (the notion of a continuity between the human realm and the rest of natural phenomena) under the banner of Complexity Science. This notion is examined explicitly by revisiting the age-old question of naturalism and connecting ideas in complexity science with the work of e.g. Roy Bhaskar, Mario Bunge, William Wimsatt, and David Lane. A philosophical foundation for a complexity science of societal systems is thereby sketched, taking the form of an integrative and methodologically pluralist 'complex realism'. The first two papers provide a theoretical perspective on the distinction between social and natural: Paper I notes that societal systems combine two qualities that are commonly referred to as complexity and complicatedness into an emergent quality that we refer to as 'wickedness', and which is fundamentally and irreducibly different from either quality in isolation. This explains the recalcitrance of societal systems to the powerful approaches that exist for dealing with both of these qualities in isolation, and implies that they indeed ought to be treated as a distinct class of systems. Paper II uses the plane spanned by complexity and complicatedness to categorize seven different system classes, providing a systematic perspective on the study of societal systems. The suggested approach to societal systems following from these conclusions is exemplified by three studies in different fields and empirical contexts. Paper III combines a number of theories that can be seen as responses to wickedness, in the form of evolutionary developmental theories and theories of societal change, to develop a synthetic theory for cultural evolution.
Paper IV exemplifies how simulation can be integrated with social theory for the study of emergent effects in societal systems, contributing a network model to investigate how the structural properties of free social spaces impact the diffusion of collective mobilization. Paper V exemplifies how digital trace data analysis can be integrated with qualitative social science, by using topic modeling as a form of corpus map to aid critical discourse analysis, implying a view of formal methods as aids for qualitative exploration, rather than as part of a reductionist approach.
In: Internet histories : digital technology, culture and society, 1 (2017) 1/2, p. 160-172
Abstract: Among the conceptual and methodological opportunities afforded by the Internet Archive, and more specifically, the WayBack Machine, is the capacity to capture and 'play back' the history of a web page, most notably a website's homepage. These playbacks could be construed as 'website histories', distinctive at least in principle from other uses of the Internet Archive such as 'digital history' and 'Internet history'. In the following, common use cases for web archives are put forward in a discussion of digital source criticism. Thereafter, I situate website history within traditions in web historiography. The particular approach to website history introduced here is called 'screencast documentaries'. Building upon Jon Udell's pioneering screen-capturing work retelling the edit history of a Wikipedia page, I discuss overarching strategies for narrating screencast documentaries of websites, namely histories of the Web as seen through the changes to a single page, media histories as negotiations between new and old media, as well as digital histories made from scrutinising changes to the list of priorities at a tone-setting institution such as whitehouse.gov.
In: The datafied society : studying culture through data / Mirko Tobias Schäfer... (eds.)
Amsterdam : Amsterdam University Press, 2017. - P. 75-94
Abstract: The chapter starts with a short summary of what we consider to be five central challenges concerning the recent move towards Digital Methods. We then interrogate David Berry's concept of 'digital Bildung' as a means of facing these challenges. Our goal in this discussion is, maybe paradoxically, to move the spotlight from 'the digital' and programming, to the plethora of concepts and knowledges mobilized in digital tools. To this end, we discuss three examples that allow us to both concretise and complicate the debate about what kind of skill set is needed by digital scholars.
Armel Jacques Nzekon Nzeko'o
In: Proceedings of the 13th international conference on web information systems and technologies : Volume 1 WEBIST ; April 25-27, 2017, in Porto, Portugal / Tim A. Majchrzak... (eds.)
Setúbal, Portugal : SciTePress, 2017. - P. 268-275
Abstract: Recommender systems are an answer to information overload on the web. They filter and present to the customer a small subset of items that they are most likely to be interested in. Since users' interests may change over time, accurately capturing these dynamics is important, though challenging. The Session-based Temporal Graph (STG) has been proposed by Xiang et al. to provide temporal recommendations by combining long- and short-term preferences. Later, Yu et al. introduced an extension called Topic-STG, which takes into account topics extracted from tweets' textual information. Recently, we pushed the idea further and proposed Content-based STG. However, in all these frameworks, the importance of links does not depend on their arrival time, which is a strong limitation: at any given time, purchases made last week should have a greater influence than purchases made a year ago. In this paper, we address this problem by proposing Time Weight Content-based STG, in which we assign a time-decreasing weight to edges. Using Time-Averaged Hit Ratio, we show that this approach outperforms all previous ones in real-world situations.
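The exact weighting function is not given in the abstract; a minimal sketch of the time-decay idea, assuming an exponential decay with a one-week half-life, could look like this:

```python
# Hedged sketch: each edge's influence decays exponentially with its age,
# so last week's purchases outweigh last year's. The half-life value and
# the decay shape are illustrative assumptions, not the paper's choices.

def time_weight(edge_time, current_time, half_life=7 * 24 * 3600):
    """Exponential decay: the weight halves every `half_life` seconds."""
    age = current_time - edge_time
    return 0.5 ** (age / half_life)

now = 1_000_000_000
recent = time_weight(now - 7 * 24 * 3600, now)    # one week old  -> 0.5
old = time_weight(now - 365 * 24 * 3600, now)     # one year old  -> near 0
```

Such a weight would then multiply each edge's contribution when scoring candidate items.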
Abstract: Graph theory provides a language for studying the structure of relations, and it is often used to study interactions over time too. However, it poorly captures the jointly temporal and structural nature of interactions, which calls for a dedicated formalism. In this paper, we generalize graph concepts in order to cope with both aspects in a consistent way. We start with elementary concepts like density, clusters, or paths, and derive from them more advanced concepts like cliques, degrees, clustering coefficients, or connected components. We obtain a language to directly deal with interactions over time, similar to the language provided by graphs to deal with relations. This formalism is self-consistent: usual relations between different concepts are preserved. It is also consistent with graph theory: graph concepts are special cases of the ones we introduce. This makes it easy to generalize higher-level objects such as quotient graphs, line graphs, k-cores, and centralities. This paper also considers discrete versus continuous time assumptions, instantaneous links, and extensions to more complex cases.
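To make the flavor of such generalizations concrete, here is one natural way (a sketch, not necessarily the paper's exact definition) to extend graph density to a link stream where each link is present over time intervals within an observation window [0, T]:

```python
# Hedged sketch: density as the fraction of the (pair, time) space that is
# actually covered by links, generalizing |E| / (n(n-1)/2) for static graphs.

def stream_density(link_intervals, nodes, T):
    """link_intervals maps frozenset({u, v}) -> list of (start, end) intervals."""
    covered = sum(end - start
                  for intervals in link_intervals.values()
                  for start, end in intervals)
    n = len(nodes)
    possible = T * n * (n - 1) / 2  # every pair linked at every instant
    return covered / possible if possible else 0.0

links = {
    frozenset({"a", "b"}): [(0, 5)],
    frozenset({"b", "c"}): [(2, 4)],
}
d = stream_density(links, {"a", "b", "c"}, T=10)  # (5 + 2) / 30
```

If every pair is linked for the whole window the density is 1, recovering the static complete graph as a special case.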
Abstract: We study strategic interaction between agents who distill the complex world around them into simpler situations. Assuming agents share the same cognitive frame, we show how the frame affects equilibrium outcomes. In one-shot and repeated interactions, the frame causes agents to be either better or worse off than if they could perceive the environment in full detail: it creates a fog of cooperation or a fog of conflict. In repeated interaction, the frame is as important as agents' patience in determining the set of equilibria: for a fixed discount factor, when all agents coordinate on what they perceive as the best equilibrium, there remain significant performance differences across dyads with different frames. Finally, we analyze some tensions between incremental and radical changes in the cognitive frame.
In: ASONAM 2017 : proceedings of the 2017 IEEE/ACM international conference on advances in social networks analysis and mining 2017 / Jana Diesner... (eds.)
New York : ACM, 2017. - P. 667-674
Abstract: The ability of a node to relay information in a network is often measured using betweenness centrality. To take into account the fact that the role of nodes varies through time, several adaptations of this concept have been proposed for time-evolving networks. However, these definitions are demanding in terms of computational cost, as they call for the computation of time-ordered paths. We propose a definition of centrality in link streams which is node-centric, in the sense that we only take into account the direct neighbors of a node to compute its centrality. This restriction allows us to carry out the computation in a shorter time compared to a case where any pair of nodes in the network would have to be considered. Tests on empirical data show that this measure is strongly correlated with the number of times a node would relay information in a flooding process. We suggest that this is a good indication that this measure can be of use in practical contexts where a node has limited knowledge of its environment, such as routing protocols in delay-tolerant networks.
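The abstract does not give the node-centric formula itself; as a purely illustrative sketch of the idea, one can count, for a node v, the ordered pairs of interactions (t1, u, v) followed by (t2, v, w) with t2 > t1 and u != w, i.e. opportunities for v to relay information between its direct neighbors:

```python
# Illustrative only: a local relay-opportunity count using just the
# interactions incident to v, not the paper's actual centrality definition.

def relay_count(links, v):
    """links: list of (t, a, b) triplets for an undirected link stream."""
    # Keep only interactions involving v, as (time, neighbor) pairs.
    times = sorted((t, a if b == v else b) for t, a, b in links if v in (a, b))
    count = 0
    for i, (t1, u) in enumerate(times):
        for t2, w in times[i + 1:]:
            if t2 > t1 and w != u:
                count += 1  # v could pass information from u to w
    return count

stream = [(1, "u", "v"), (2, "v", "w"), (3, "v", "x")]
# v can relay u->w, u->x, and w->x: count is 3.
```

Because only v's own links are inspected, the cost is quadratic in v's number of interactions rather than requiring time-ordered paths over the whole stream.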
In: Complex networks VIII : proceedings of the 8th Conference on Complex Networks ; CompleNet 2017 / Bruno Goncalves... (eds.)
Cham : Springer, 2017. - P. 81-92
(Springer proceedings in complexity)
Abstract: The analysis of dynamic networks has received a lot of attention in recent years, thanks to the greater availability of suitable datasets. One way to analyse such datasets is to study temporal motifs in link streams, i.e. sequences of links for which we can assume causality. In this article, we study the relationship between temporal motifs and communities, another important topic in complex networks. Through experiments on several real-world networks, with synthetic and ground-truth community partitions, we identify motifs that are overrepresented at the frontier of communities, or inside them.
Abstract: We explore a new mechanism to explain polarization phenomena in opinion dynamics. The model is based on the idea that agents evaluate alternative views on the basis of the social feedback obtained on expressing them. High support for the favored, and therefore expressed, opinion in the social environment is treated as positive social feedback which reinforces the value associated with this opinion. In this paper we concentrate on the model with dyadic communication and encounter probabilities defined by an unweighted, time-homogeneous network. The model captures polarization dynamics more plausibly than bounded-confidence opinion models and avoids the extensive opinion flipping usually present in binary opinion dynamics. We perform systematic simulation experiments to understand the role of network connectivity in the emergence of polarization.
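A minimal sketch of this reinforcement idea (the update rule, learning rate, and reward values are assumptions for illustration, not the paper's exact specification): each agent keeps a value for each of two opinions, expresses the higher-valued one, and reinforces it when a randomly encountered neighbor agrees.

```python
import random

# Hedged sketch of social-feedback reinforcement in dyadic communication.
random.seed(0)

def step(values, network, alpha=0.2):
    """One dyadic round on an unweighted network (assumed update rule)."""
    speaker = random.choice(list(network))
    listener = random.choice(network[speaker])
    expressed = max(values[speaker], key=values[speaker].get)
    heard = max(values[listener], key=values[listener].get)
    reward = 1.0 if heard == expressed else -1.0  # social feedback signal
    # Reinforce (or weaken) the value of the expressed opinion.
    values[speaker][expressed] += alpha * (reward - values[speaker][expressed])

# Two agents who already agree reinforce each other's opinion.
vals = {"a": {+1: 0.1, -1: 0.0}, "b": {+1: 0.2, -1: 0.0}}
net = {"a": ["b"], "b": ["a"]}
for _ in range(50):
    step(vals, net)
```

With agreeing neighbors the expressed opinion's value climbs towards 1, while the unexpressed one never changes, which is the reinforcement-driven lock-in behind polarization.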
In: ASONAM 2017 : proceedings of the 2017 IEEE/ACM international conference on advances in social networks analysis and mining 2017 / Jana Diesner... (eds.)
New York : ACM, 2017. - P. 935-942
Abstract: A link stream is a sequence of triplets (t, u, v) meaning that nodes u and v have interacted at time t. Capturing both the structural and temporal aspects of interactions is crucial for many real-world datasets, such as contacts between individuals. We tackle the issue of activity prediction in link streams, that is to say, predicting the number of links occurring during a given period of time, and we present a protocol that takes advantage of the temporal and structural information contained in the link stream. We introduce a way to represent the captured information using different features and combine them in a prediction function which is used to evaluate the future activity of links.
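A sketch of the protocol's spirit (the features and weights here are illustrative assumptions, not those of the paper): represent the stream as (t, u, v) triplets and predict a pair's future activity as a weighted combination of simple temporal features.

```python
# Hedged sketch: two toy features (recent activity, total activity) combined
# by an assumed linear prediction function.

def activity_features(stream, pair, t_now, window):
    """Feature vector for one node pair: recent and overall link counts."""
    times = [t for t, u, v in stream if {u, v} == set(pair)]
    recent = sum(1 for t in times if t_now - window <= t < t_now)
    total = len(times)
    return [recent, total]

def predict_activity(features, weights=(0.7, 0.3)):
    """Linear prediction function combining the features."""
    return sum(w * f for w, f in zip(weights, features))

stream = [(1, "u", "v"), (3, "u", "v"), (9, "u", "v"), (9, "u", "w")]
feats = activity_features(stream, ("u", "v"), t_now=10, window=5)  # [1, 3]
score = predict_activity(feats)
```

In the paper's protocol the combination weights would be fitted on past windows of the stream rather than fixed by hand.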