


Research

HEP big data statistics Energy Network

Here I introduce my current work (Gogoro, since 2018) and my Ph.D. publications (CERN, 2011~2017): two journal publications and results approved by CERN (CERN Approval) from my years as a Ph.D. candidate and researcher in high energy physics. They are all about big data analysis. I hope these friendly and simple introductions help readers enjoy them!


Energy Network of Electric Vehicles (since 2018)

Mathematical Model Development


[Queueing Theory] Energy Service Model of a Battery-Swap Platform
math statistics
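The published model itself is not reproduced here, but the queueing-theory flavor of it can be sketched with the classic M/M/c (Erlang C) result. The arrival and service rates below are made-up illustration numbers, not Gogoro data:

```python
from math import factorial

def erlang_c(arrival_rate, service_rate, servers):
    """Probability that an arriving rider must wait, and the mean wait,
    for an M/M/c queue (Erlang C formula)."""
    a = arrival_rate / service_rate        # offered load in Erlangs
    rho = a / servers                      # utilization, must be < 1
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization >= 1")
    summed = sum(a**k / factorial(k) for k in range(servers))
    tail = a**servers / (factorial(servers) * (1.0 - rho))
    p_wait = tail / (summed + tail)        # probability of waiting
    mean_wait = p_wait / (servers * service_rate - arrival_rate)
    return p_wait, mean_wait

# Hypothetical station: 10 riders/hour arriving, each charging slot
# turns around 3 batteries/hour, 4 slots in the cabinet.
p_wait, mean_wait = erlang_c(10.0, 3.0, 4)
```

Erlang C assumes Poisson arrivals and exponential service times; a real battery-swap platform would relax both, but the formula already shows how waiting probability explodes as utilization approaches one.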

High Energy Physics (2011~2017)

High energy physics (HEP) is a field that tries to uncover the deep mysteries of our universe. Several major questions in HEP remain open and waiting for us to figure out, e.g. where does mass come from? where is the antimatter? what happened right after the Big Bang? etc. Looking at the universe leaves humanity with many such questions. The well-known model that provides the basic framework for these puzzles is the Standard Model, an ensemble of theories describing the interactions of the "forces" among the "fundamental particles".


Physicists believe that understanding these particles helps us understand the universe, so several experiments have been designed and are ongoing in this exciting field: cosmic-ray detectors around the world, nuclear reactors, accelerators, etc. The biggest accelerator in the world, where I was involved, is the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. It accelerates two beams of protons to nearly the speed of light (~0.99999c) and collides them. Since the protons carry extremely high energy, the collision mimics the Big Bang at the creation of the universe. The fragments provide valuable information about the particles and the beginning of the universe. This information is collected and stored by the detectors as electronic digits, which are then reconstructed into particle kinematics. The resulting data is big enough (well, never enough actually) to be analyzed by experimental physicists with statistics and physics models.

I am an experimental physicist specializing in statistical data analysis for HEP. Below are my research results with the big data (petabyte scale) collected by the Compact Muon Solenoid (CMS) Collaboration at the LHC. They have been reviewed and published by international journals and by CERN; details about the techniques and theories can be found in their references. The analysis software is based on the C++ and Python programming languages and was run on the CERN grid computing system. The data visualization was made with ROOT, a powerful tool for statistics in particle physics and cosmology.
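None of the CMS analyses can be reproduced in a few lines, but the basic workflow — bin the data, build a model, minimize a chi-square — looks roughly like this toy fit. NumPy/SciPy stand in for ROOT here, and all numbers are invented:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy "data": Poisson counts in 20 bins from a falling exponential spectrum.
edges = np.linspace(0.0, 10.0, 21)
centers = 0.5 * (edges[:-1] + edges[1:])
true_norm, true_slope = 100.0, 0.5
observed = rng.poisson(true_norm * np.exp(-true_slope * centers))

def chi2(params):
    """Pearson chi-square: Poisson variance approximated by the model."""
    norm, slope = params
    model = norm * np.exp(-slope * centers)
    return np.sum((observed - model) ** 2 / model)

fit = minimize(chi2, x0=[80.0, 0.3], method="Nelder-Mead")
norm_hat, slope_hat = fit.x
```

In the real analyses the model is a physics prediction from simulation rather than a two-parameter exponential, but the minimization logic is the same.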

Journal Publications


[Regression] Top-quark CP violation in LHC data
chi-square likelihood Monte Carlo asymmetry statistics
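The key observable in this kind of measurement is a counting asymmetry between two event classes. A minimal sketch, with made-up event counts rather than the published data:

```python
import numpy as np

def asymmetry(n_plus, n_minus):
    """A = (N+ - N-) / (N+ + N-), with the usual binomial error propagation
    for counting statistics."""
    n = n_plus + n_minus
    a = (n_plus - n_minus) / n
    err = np.sqrt((1.0 - a**2) / n)
    return a, err

# Toy Monte Carlo: 100k events with a small built-in asymmetry of 5%.
rng = np.random.default_rng(1)
true_a = 0.05
n_plus = rng.binomial(100_000, (1.0 + true_a) / 2.0)
a_hat, a_err = asymmetry(n_plus, 100_000 - n_plus)
```

The toy shows why large samples matter: the statistical error shrinks like 1/sqrt(N), so percent-level asymmetries need hundreds of thousands of events.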



[Statistics] b' quarks decaying to b quark and Higgs boson in LHC data
upper limit background model Monte Carlo statistics
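The limit-setting machinery used at the LHC (CLs and friends) is more involved, but the core idea can be sketched as a classical upper limit for a Poisson counting experiment, assuming a perfectly known background:

```python
from scipy.stats import poisson

def upper_limit(n_obs, background, cl=0.95, s_max=50.0, step=0.001):
    """Classical upper limit on a signal yield s: the smallest s such that
    observing <= n_obs events is improbable, P(N <= n_obs | s + b) <= 1 - cl."""
    s = 0.0
    while s < s_max:
        if poisson.cdf(n_obs, s + background) <= 1.0 - cl:
            return s
        s += step
    return s_max

# Hypothetical counting experiment: 3 observed events over an expected
# background of 1.0 event.
limit = upper_limit(n_obs=3, background=1.0)
```

A familiar sanity check: with zero observed events and zero background, the 95% CL limit is the textbook value of about 3 signal events.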

CERN Approval


[Algorithm] Preshower Detector Alignment
gradient descent minimum chi-squared rotation matrix statistics
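The alignment idea — minimize a chi-square over rotation parameters with gradient descent — can be sketched in 2D with fake hit positions. The real detector problem is three-dimensional and includes translations, so this is only an illustration:

```python
import numpy as np

def rot(theta):
    """2D rotation matrix."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Fake alignment data: reference hits vs. hits rotated by an unknown angle.
rng = np.random.default_rng(2)
ref = rng.uniform(-1.0, 1.0, size=(200, 2))
true_theta = 0.02                          # ~20 mrad misalignment
meas = ref @ rot(true_theta).T + rng.normal(0.0, 1e-3, size=ref.shape)

def chi2(theta):
    """Sum of squared residuals between measured and rotated reference hits."""
    resid = meas - ref @ rot(theta).T
    return np.sum(resid**2)

def grad(theta, eps=1e-6):
    """Central-difference numerical gradient of the chi-square."""
    return (chi2(theta + eps) - chi2(theta - eps)) / (2.0 * eps)

theta = 0.0                                # start from "no misalignment"
learning_rate = 3e-3
for _ in range(100):
    theta -= learning_rate * grad(theta)
```

After the loop, `theta` recovers the injected misalignment to within the hit resolution.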



[Validation] Double b-tagging performance
ROC efficiency error propagation Monte Carlo statistics
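A tagging efficiency with its binomial uncertainty, plus a simple ROC scan over a discriminator threshold, can be sketched like this. The Gaussian discriminator scores are invented stand-ins, not real CMS tagger output:

```python
import numpy as np

def efficiency(n_pass, n_total):
    """Tagging efficiency with the usual binomial error propagation."""
    eff = n_pass / n_total
    err = np.sqrt(eff * (1.0 - eff) / n_total)
    return eff, err

# Toy ROC: scan a cut over signal-like and background-like score samples.
rng = np.random.default_rng(3)
sig = rng.normal(1.0, 0.5, 10_000)   # stand-in for b-jet discriminator scores
bkg = rng.normal(0.0, 0.5, 10_000)   # stand-in for light-jet scores
thresholds = np.linspace(-1.0, 2.5, 50)
tpr = [(sig > t).mean() for t in thresholds]   # signal (tagging) efficiency
fpr = [(bkg > t).mean() for t in thresholds]   # mistag rate
```

Plotting `tpr` against `fpr` gives the ROC curve; a better tagger pushes the curve toward the top-left corner.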



[Validation] b-tagging commissioning
likelihood classification Monte Carlo statistics collaboration



GitHub | LinkedIn