How to do dropout regularisation: A Top Deep Learning Trick that’ll Improve Performance for Data Scientists
Proving Gaussian Processes and Neural Networks are equivalent: The Future of Machine Learning is Here
Choosing an approximate inference method: Variational Inference vs MCMC for Data Scientists
The problems with social media: Thalidomide and the Children of Social Media
Shortcomings of government reporting: Gross Failures in Coronavirus Reporting
Racial discrimination as biases in ML: How to Remove Racial Discrimination from Data Science
Problems with corporate-sponsored education: Sorry, the TensorFlow Developer Certificate is Pointless
Hardships of working in ML and Data Science: The World-Class Data Scientist
Deep Learning is a bit of a gimmicky title to give a set of Machine Learning algorithms that have many hidden layers. There’s actually nothing new here compared to a Neural Network, but the term Deep Learning (in particular) has been in the news a lot lately.
The hype around this phenomenon is well received, though. The application of Deep Learning algorithms to common problems has improved accuracy scores and the deployability of many solutions. However, is this largely due to the fact that we now have more powerful servers and can actually run these algorithms, or are Neural Networks actually that much better than everything else? …
The skill set for a researcher is now broader than ever. We have to know how to code, how to do the maths, and how to use a whole host of programming tools.
I mean, it’s not super hard to learn all of these things, but it takes a while to become confident enough to actively engage in this space. How do you even know where to begin?
Either way, knowing Linux is super important on your journey to becoming a Machine Learning researcher because, for me as a Mac user, the Terminal is pretty similar to a Linux shell: Mac OS X is a Unix OS and its command line is 99.9% …
Data Scientists and Machine Learning researchers alike keep an eye out for what’s going on in the community.
I’m a curious individual, so I wanted to see what had been trending this month: all of which I found to be pretty damn interesting.
Here goes:
The researchers at Facebook have come out with an update to their Pixel-aligned Implicit Function (PIFu) model, which aligns the pixels of a 2D image with the corresponding points of a 3D surface. Using PIFu, Facebook have built an end-to-end Deep Learning model for digitising people, with the ability to infer 3D surface and texture from either a single image or multiple images. …
The ability to format a string is a pretty basic requirement for anyone who can code, but there’ve been more than a few ways to do this in Python over the years. There’s the original % method, there’s the .format method and, more recently, there’s the f-string method. So which do you choose?
I’m pretty lazy, so once I got the hang of the .format method I kind of stuck with it, but there are drawbacks, which I’ll cover, that signify the problems with it.
But first, let’s do a quick overview:
This is the classic method, which those who were coding in the early Python 2 days will remember clearly. Essentially, you put a %s placeholder inside the template string (the ‘s’ signals you want to chuck in a string), then add the % operator and your value after the string, as follows. …
Python and Java are relatively different languages. You could say that the C and C++ combo makes for a better comparison, but in terms of broad functionality, and from a beginner’s point of view, Python and Java are generally the two languages that beginners struggle to decide between.
The reason is that C++ generally gets discarded because it’s an ‘old’ language and most online references will recommend not learning it. Likewise for Matlab: it has a very specific use case, but past that you can’t really get much more functionality out of it. …
NIPS is classed as one of the foremost academic conferences in the AI space. For academics, being published or running a workshop at this conference is a sign that you’re doing well and making a difference.
The competition to get into this conference is high, like really, really high. So, generally speaking, you would expect that the best of the best have published the most in it. Let’s look at the most prolific authors who’ve submitted to the NIPS conferences since 1990:
With names like Ghahramani, Hinton, Bengio and Jordan, clearly the big guys in AI have been busy! These guys are well known for revolutionising AI, and for good reason, as they’ve published so much more than others. …
There are domains where Neural Networks aren’t actually that powerful (think of problems with high degrees of heteroscedasticity, etc.), but even so, the range of challenges they face and the degree to which they succeed at them is outstanding.
Note that on Kaggle, more competitions have been won using something like XGBoost than using Neural Networks. Either way, the developments in Deep Learning make us really, really like Neural Networks.
And why do you need so many hidden layers? What is the mathematical purpose of having them?
The XOR problem is a classification task where a model should return True if the two inputs are different, and False if they are equal. …
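To see why a hidden layer matters here, below is a minimal sketch of my own (not from the original post) of a tiny NumPy network learning XOR; a single-layer perceptron provably cannot separate these points, since XOR is not linearly separable:

import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: the output is 1 only when the two inputs differ.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 4 sigmoid units; at least one hidden layer is
# required because no straight line separates the XOR classes.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass through the hidden layer and output unit.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backpropagate the squared-error gradient (constant factors
    # are folded into the learning rate).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round().ravel())  # expected: [0. 1. 1. 0.] (a different seed may need more steps)

The hidden units let the network bend its decision boundary around the two “inputs differ” corners of the square, which is exactly the mathematical purpose the extra layers serve.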
Data Science for Sustainability is becoming more important as the problem of climate change gets more attention. As the UN says, ‘Data is the lifeblood of decision-making and the raw material for accountability’, so it’s our responsibility to build a framework that keeps us accountable for keeping the world green.
Sustainability is oddly one of the most documented industries ripe for Data Science. There are so many data points and experiments that are all weakly linked, but little has actually been done to unify them under one overarching framework. Sure, we have loads of experiments showing correlation, but where’s the direct causation? …
Artificial Intelligence has had a crazy couple of years, and 2020 has been like no other. With a pandemic and a global recession alongside incredible gains in AI, there’s simply so much to keep an eye on.
I studied for my MPhil in Machine Learning in 2015, and back then AI was in its relative infancy, but it was actually the perfect time to see the foundations of the industry that was about to form.
Pre-2015, the theory was pretty sophisticated, but the active day-to-day implementation of AI or Machine Learning was quite dated. …