Posts for: #Computer Science

RP2040 Randomness and Ring Oscillator

Randomness

Having real randomness on a deterministic machine is an interesting topic. One can easily achieve pseudo-randomness by creating a complex algorithm with unpredictable outputs (such as a chaotic system), but for a microcontroller that runs the same code every time it boots, that approach will give you the same numbers every time. So it would be nice to introduce some external chaotic source; the most commonly used one is probably thermal noise, which requires some kind of temperature sensor. I have also heard that lava lamps are used as a chaotic source in real applications: Cloudflare uses 100 such lamps for their random number generator. How bizarre!
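On the RP2040 itself, the usual on-chip entropy source is the ring oscillator (ROSC), whose timing jitter can be sampled one bit at a time. Below is a minimal sketch using the pico-sdk's `rosc_hw->randombit` register; the sampling delay and byte assembly are my own illustrative choices, and the raw bits are biased, so a real application should whiten them (e.g., with a von Neumann extractor).

```c
// Minimal sketch (pico-sdk): build bytes from the RP2040 ring
// oscillator's RANDOMBIT register. Each read yields one raw,
// unwhitened bit derived from ROSC jitter.
#include <stdio.h>
#include "pico/stdlib.h"
#include "hardware/structs/rosc.h"

static uint8_t rosc_random_byte(void) {
    uint8_t byte = 0;
    for (int i = 0; i < 8; i++) {
        byte = (byte << 1) | (rosc_hw->randombit & 1u);
        busy_wait_us(5);  // let the oscillator drift between samples
    }
    return byte;
}

int main(void) {
    stdio_init_all();
    while (true) {
        printf("%02x\n", rosc_random_byte());
        sleep_ms(500);
    }
}
```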

Read more →

Generative Adversarial Networks and Game Theory

A GAN (Generative Adversarial Network) is a neural network model consisting of two networks, one commonly referred to as the Generator and the other as the Discriminator. Adversarial learning is the study of attacking neural networks, but here the adversarial idea is used as a tool to build the GAN model. In each iteration, the Generator synthesizes a product (commonly images in modern applications), and the Discriminator takes this product as input and judges whether it is real or fake (i.e., produced by the network); when the Discriminator catches a fake, the Generator's parameters are tuned, with the goal of making its products as realistic as possible.
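This tug-of-war is exactly a two-player minimax game, which is where the game theory comes in. In the standard formulation from the original GAN paper (Goodfellow et al., 2014), the two networks optimize a single value function in opposite directions:

$$ \underset{G}{\min} \ \underset{D}{\max} \ V(D, G) = \mathbb{E}_{x\sim p_{\text{data}}}\left [ \log D(x) \right ] + \mathbb{E}_{z\sim p_{z}}\left [ \log\left ( 1-D(G(z)) \right ) \right ] $$

The Discriminator $D$ pushes $V$ up by labeling real and fake samples correctly, while the Generator $G$ pushes it down by fooling $D$; at the game's equilibrium, $D$ can do no better than guessing.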

Read more →

CMake vs. Make

What was the Problem? :(

When playing with my RP2040 and following the official SDK instructions, I wondered why I had to type cmake and then make. Why does it take two steps to build my project?

Well, long story short: CMake is a cross-platform Makefile generator, while make "reads the makefile and invokes a compiler, linker, and possibly other programs to make an executable file" (Microsoft).

CMake

Instead of thinking of CMake as a "C/C++ program maker," I tend to describe it as a "cross-platform maker." As its design principle says, "CMake is designed to be used in conjunction with the native build environment" (cmake.org); thus, it is independent of the operating system it is working on (and therefore of the compilers), which means that as long as we configure the CMakeLists.txt correctly, CMake should "generate standard build files (e.g., makefiles on Unix and projects/workspaces in Windows MSVC)" on all the supported operating systems (cmake.org).
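As a concrete sketch, here is a minimal, hypothetical CMakeLists.txt for a single-file project (the project and file names are placeholders, not from the Pico SDK):

```cmake
# Minimal hypothetical example: CMake only declares WHAT to build;
# the generated native build files decide HOW on each platform.
cmake_minimum_required(VERSION 3.13)
project(hello C)

add_executable(hello hello.c)
```

Running `cmake` against this file generates the native build files (e.g., a Makefile on Unix), and `make` then performs the actual compiling and linking, which is exactly why the build takes two steps.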

Read more →

Bare Metal WebServer On Pi PicoW

Intro


I’ve been pretty into the Raspberry Pi Pico family lately. It looks nice, it’s new, and there’s a fast-growing community around it, so it would be cool to play together. The Pico W is the newer member with the…well, you guessed it…wireless capability. I thought it would be nice to write some code that lets me send data from the terminal to the Pico over Wi-Fi.

Arduino-Pico

This is an Arduino core for the Pico, which is based on the official Raspberry Pi Pico SDK but with more add-ons; basically, it allows you to use Arduino libraries. You can find their GitHub link here and the latest documentation here. The project is very active. A rough sketch of the terminal-to-Pico idea follows below.
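Here is a minimal sketch of that idea using the Arduino-Pico core's WiFi library; the SSID, password, and port are placeholders, and the echo logic is just for illustration:

```cpp
// Hypothetical sketch: the Pico W runs a raw TCP server and echoes
// back whatever a terminal client (e.g., netcat) sends to it.
#include <WiFi.h>

const char *ssid = "YOUR_SSID";      // placeholder credentials
const char *pass = "YOUR_PASSWORD";
WiFiServer server(4242);             // arbitrary TCP port

void setup() {
  Serial.begin(115200);
  WiFi.begin(ssid, pass);
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);                      // wait for the access point
  }
  Serial.println(WiFi.localIP());    // address to point the terminal at
  server.begin();
}

void loop() {
  WiFiClient client = server.available();  // returns a connected client
  while (client && client.connected()) {
    if (client.available()) {
      String line = client.readStringUntil('\n');
      Serial.println(line);          // show the data on the Pico side
      client.println(line);          // echo it back to the terminal
    }
  }
}
```

From a terminal on the same network, you could then test it with something like `nc <pico-ip> 4242`.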

Read more →

FAMA 70 Factor Model using Modern Machine Learning Techniques

This paper explores the advantages and disadvantages of different ML models when dealing with a particular dataset.

[pdf]
Read more →

Foundations of Machine Learning III

Machine Learning Notes I

Machine Learning Notes II

The Primal Question of Optimization

A general optimization problem can usually be rewritten as maximizing or minimizing a certain function subject to several constraints; for example, maybe you want the optimized variable to be non-negative. The most basic form of such a problem is called the primal question (i.e., the primal problem), which looks like this:

$$ \begin{aligned} & \underset{x \in \mathbb{R}^n}{\min} \ f(x) \newline & \text{s.t.} \ \ g_i(x) \leq 0, \quad i = 1, 2, \dots, m \newline & \phantom{\text{s.t.} \ \ } h_i(x) = 0, \quad i = 1, 2, \dots, p \end{aligned} $$
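The standard next step (not specific to this post, but it is where the primal/dual machinery comes from) is to fold the constraints into the objective via the Lagrangian, with multipliers $\lambda_i \geq 0$ for the inequality constraints and $\nu_i$ for the equality constraints:

$$ \mathcal{L}(x, \lambda, \nu) = f(x) + \sum_{i=1}^{m} \lambda_i g_i(x) + \sum_{i=1}^{p} \nu_i h_i(x) $$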

Read more →

Foundations of Machine Learning II

Machine Learning Notes I

The least squares estimates of α and β

For simple linear regression:

$$ E\left ( Y|X=x \right ) = \alpha +\beta x $$

we have:

$$ \hat{\beta } = \frac{cov\left ( X, Y \right )}{var\left ( X \right )} $$

$$ \hat{\alpha} = \bar{Y} - \hat{\beta}\bar{X} $$

Linear Regression way

We could always use the NN method to solve the regression problem, but that makes it nearly impossible to locate exactly which layer foreshadows which feature of the data. Thus, maybe the better way is to upscale the dimension of the linear regression method; that is, we use not only $ x $ but also $ x^{2}, x^{1/2}, \dots $ to approach the true curve.
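In matrix form, this is just ordinary least squares on an expanded design matrix; here is a sketch with the basis functions mentioned above (the choice of columns is illustrative):

$$ X = \begin{pmatrix} 1 & x_1 & x_1^{2} & x_1^{1/2} \newline \vdots & \vdots & \vdots & \vdots \newline 1 & x_n & x_n^{2} & x_n^{1/2} \end{pmatrix}, \qquad \hat{\beta} = \left ( X^{\top} X \right )^{-1} X^{\top} y $$

The model stays linear in the coefficients $\beta$, which is why the closed-form least squares solution still applies even though the fitted curve is nonlinear in $x$.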

Read more →

Bayes’ Rule

When I was in high school learning AP Statistics, I learned the formula:

$$ P(A|B)=\frac{P(A\cap B)}{P(B)} , P(B|A)=\frac{P(A\cap B)}{P(A)}$$

which can be transformed into:

$$ P(A\cap B)=P(A)\cdot P(B|A)=P(B)\cdot P(A|B) $$

$P(A|B)$ is called a "conditional probability," which pretty much explains itself. Back then, I only knew the meaning of each element but not the whole idea; what I did was just plug in numbers, because the statement is kind of abstract on its own: "the probability that event $A$ happens given that event $B$ happened equals the probability that events $A$ and $B$ both happen, divided by the probability that event $B$ happens."
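Dividing the second equality above by $P(B)$ gives Bayes' rule itself, which is the whole point of relating the two conditional probabilities:

$$ P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)} $$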

Read more →

Foundations of Machine Learning I

Lately, I have been studying machine learning, and producing output (taking notes) is a vital step of that process. Here, I am using Andrew Ng’s Stanford Machine Learning course on Coursera, which uses the MATLAB language.

So, by default, the rest of the code I write in this post is based on MATLAB.

What is ML?

“A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.” (Tom Mitchell) For example, a spam filter’s task T is classifying emails, its experience E is a corpus of emails labeled spam or not, and its performance P is the fraction it classifies correctly.

Read more →

The Technical Report of Self-Stabilized TVC Rocket
[pdf]
Read more →