A new algorithm for identifying the flavour of B0s mesons at LHCb
One of the most challenging data analysis tasks in modern High Energy Physics experiments is the identification of particles. In these proceedings we review the new approaches used for particle identification at the LHCb experiment. Machine-learning techniques are used to identify the species of charged and neutral particles using several observables obtained from the LHCb sub-detectors. We show the performance of various solutions based on Neural Network and Boosted Decision Tree models.
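As an illustration only, a multi-class particle-identification discriminator combining sub-detector observables can be sketched with a boosted-decision-tree model. The observables and data below are entirely synthetic stand-ins (not the actual LHCb inputs), chosen just to show the structure of such a classifier:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Toy observables per track: (RICH log-likelihood difference,
# calorimeter E/p, muon-station hit count). Purely illustrative;
# the real inputs are far more numerous and detector-specific.
def toy_tracks(n, mean):
    return rng.normal(loc=mean, scale=1.0, size=(n, 3))

X = np.vstack([
    toy_tracks(300, [ 2.0, 0.1, 0.0]),  # "pion-like" tracks
    toy_tracks(300, [-2.0, 0.2, 0.0]),  # "kaon-like" tracks
    toy_tracks(300, [ 0.0, 1.0, 4.0]),  # "muon-like" tracks
])
y = np.repeat([0, 1, 2], 300)  # class labels: pion, kaon, muon

# A Boosted Decision Tree classifier over the combined observables.
clf = GradientBoostingClassifier(n_estimators=50, random_state=0)
clf.fit(X, y)
accuracy = clf.score(X, y)
```

In practice the model would be trained on calibration samples and evaluated on held-out data per momentum bin; the sketch above only shows how heterogeneous sub-detector observables are combined into a single multi-class discriminator.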
Reconstruction and identification in the calorimeters of modern High Energy Physics experiments is a complicated task. Solutions are usually driven by a priori knowledge about the expected properties of the reconstructed objects. Such an approach is also used to distinguish single photons in the electromagnetic calorimeter of the LHCb detector at the LHC from overlapping photons produced in high-momentum pi0 decays. We studied an alternative solution based on applying machine-learning techniques to primary calorimeter information, namely the energies collected in the individual cells around the energy cluster. Constructing such a discriminator from “first principles” allowed us to improve the separation performance from 80% to 93%, which corresponds to reducing the fake rate for primary photons by a factor of two. In this presentation we discuss different approaches to the problem, the architecture of the classifier and its optimization, and compare the performance of the ML approach with the classical one.
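The idea of classifying on raw cell energies around the cluster can be sketched with a toy model. Everything below is a hypothetical stand-in (grid size, shower shapes, and noise level are invented for illustration, not taken from the LHCb calorimeter): single photons are simulated as one Gaussian energy blob, merged pi0 photons as two nearby blobs, and a boosted-decision-tree classifier is trained directly on the flattened cell energies:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
GRID = 5  # 5x5 cells around the cluster seed (illustrative choice)

def blob(cx, cy):
    """Energy deposit of one photon, modelled as a 2D Gaussian over the grid."""
    xs, ys = np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij")
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / 1.5)

def cluster(merged):
    """One cluster image: single photon, or two overlapping photons (pi0-like)."""
    img = blob(2 + rng.normal(0, 0.2), 2 + rng.normal(0, 0.2))
    if merged:  # second photon displaced by roughly one cell
        img += blob(2 + rng.normal(1.0, 0.3), 2 + rng.normal(1.0, 0.3))
    img += rng.normal(0, 0.02, size=img.shape)  # toy electronic noise
    return img.ravel()  # feed raw cell energies directly to the classifier

X = np.array([cluster(m) for m in [False] * 400 + [True] * 400])
y = np.array([0] * 400 + [1] * 400)  # 0 = single photon, 1 = merged pi0

clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
accuracy = clf.score(X, y)
```

The point of the sketch is the "first principles" design choice: rather than hand-crafted shower-shape variables, the discriminator consumes the per-cell energies directly and lets the model learn the separating features.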
A full amplitude analysis of Λ0b → J/ψ pπ− decays is performed with a data sample acquired with the LHCb detector from 7 and 8 TeV pp collisions, corresponding to an integrated luminosity of 3 fb−1. A significantly better description of the data is achieved when, in addition to the previously observed nucleon excitations N → pπ−, either the Pc(4380)+ and Pc(4450)+ → J/ψ p states, previously observed in Λ0b → J/ψ pK− decays, or the Zc(4200)− → J/ψ π− state, previously reported in B0 → J/ψ K+π− decays, or all three, are included in the amplitude models. The data support a model containing all three exotic states, with a significance of more than three standard deviations. Within uncertainties, the data are consistent with the Pc(4380)+ and Pc(4450)+ production rates expected from their previous observation, taking account of Cabibbo suppression.
The production of W and Z bosons in association with jets is studied in the forward region of proton-proton collisions collected at a centre-of-mass energy of 8 TeV by the LHCb experiment, corresponding to an integrated luminosity of 1.98 ± 0.02 fb−1. The W boson is identified using its decay to a muon and a neutrino, while the Z boson is identified through its decay to a muon pair. Total cross-sections are measured and combined into charge ratios, asymmetries, and ratios of W+jet and Z+jet production cross-sections. Differential measurements are also performed as a function of both boson and jet kinematic variables. All results are in agreement with Standard Model predictions.
This volume presents new results in the study and optimization of information transmission models in telecommunication networks using different approaches, mainly based on the theories of queueing systems and queueing networks.
The paper provides a number of proposed draft operational guidelines for technology measurement, including tentative technology definitions to be used for statistical purposes, principles for the identification and classification of potentially growing technology areas, and suggestions on survey strategies and indicators. These are the key components of an internationally harmonized framework for collecting and interpreting technology data, which would need to be further developed through a broader consultation process. A summary of definitions of technology already available in OECD manuals, together with the stocktaking results, is provided in the Annex.