Saturday, 25 August 2018

How to prepare for the DAMA CDMP exam? My experience...

Update

From the DAMA website: "CDMP Status - On Saturday, August 25, 2018, the Board of Directors met for the first time as a full Board. One of the topics discussed was the current CDMP testing process.
Based on a) experiences setting up and holding the three chapter-hosted boot camps in the last month, and b) other stability issues with the CDMP testing platform and content, the DAMA I Board reluctantly decided to pause the sale of CDMP certification exams. (...) The board has committed to a 2-month timeline to identify and address current challenges, and determine a long-term solution."

Way cool from DAMA to address known issues !!!

Check their page for the latest status: https://dama.org/content/cdmp-status

FAQ

Several people have contacted me with the same questions about CDMP, so here is a little FAQ:

How much time does it take to prepare for the CDMP exam? Assuming you have no experience with Data Governance and that you study about 2-3 hours per weekend, it should take about 8-10 weeks to prepare summaries and to memorize the concepts.

Is it worth it? Overall, I would say yes if you are seriously committed to working in data management and/or data governance. Read the original post below for more details.

------------------------------------------------------------

(Original post)

There is, unfortunately, not much information explaining how to prepare for and pass the DAMA CDMP exam, so I'll share my own experience. It was pretty awful considering the price they ask and the low quality of the exam itself. (As of mid-2018) I think DAMA is doing a pretty bad job, and I'll explain why with concrete examples.

I'll also explain how to prepare for this exam, since the guidance on their website will point you in many unnecessary directions. To me, it looks like an invitation to buy more books and services than are really needed to pass the exam.

Is it worth it?

My short answer is: yes. My long answer is: yes, but I will not spend time and resources trying to pass higher levels of the certification (associate, practitioner, master, and fellow). Having nearly 20 years of experience in the software industry, I think the refinements of DAMA certifications are excessive and not really in line with the constant evolution of data management. However, the core concepts, frameworks, and knowledge available in the DAMA book are (most often) relevant. There are jewels in the dirt.

In other words, I recommend spending your time acquiring more hands-on experience, learning about technologies, participating in data projects and governance activities, rather than trying to get higher levels of certification. It will be more valuable for your career.

IMHO, the CDMP certifications could be simplified with two levels: core and advanced concepts. This would meet the needs of the industry.

About the exam questions

  • All questions are multiple choice. Most have 5 options, but a couple are true/false questions.
  • There are 100 questions to be answered in 90 minutes.
  • Some questions have several possible correct answers, but there is only one "most" correct answer - I find this idiotic. There should be only one correct answer and 4 incorrect answers. It should not be a matter of perception or interpretation.
  • Several questions check whether you have memorized definitions - unfortunately, they do not check whether you have understood the concepts. It is a pity: someone with a good memory could be certified without being competent or operational, which weakens the value of the certification IMHO.
When I passed the test, one question started with something like 'According to graph x in the DAMA book v1...'. I was outraged they would explicitly refer to an old version of the DAMA book when the 2nd edition is the recommended version.

Another question started with 'TO BE REMOVED' as if someone had been working on a new version of the test and did not finish their job.

Which topics to study

DAMA is doing a poor job at guiding those interested in passing the certification. They ought to provide a study guide or at least a precise list of topics to master for the exam like Microsoft does.

You will need to buy the DAMA book (600 pages) to study for and pass the exam. It is a reference book more than a real guide for data management. There are a lot of 'declarations' about how things should be (organizational principles), but it often falls short on explanations and justifications, or on connections between theory and real business situations. Many terms are abstract and often defined with other abstract terms, not to mention there are plenty of redundant and overlapping concepts. The authors fail to distill the key concepts and principles, which makes their work often hard to digest. In reality, things cannot be that bureaucratic. It would be too expensive and inefficient.

The exam questions often focus on details. The good news is that you don't need to focus on all parts of the book. Here is my suggested study guide for the associate level certification:
  • Chapter 1, Data management - You must understand and remember ALL its concepts and schemas.
  • Context diagrams - You must understand all the concepts described in these diagrams and how they relate to each other.
  • Roles & responsibilities - You must be crystal clear about the types of individuals and groups participating in data governance and data management activities. What are their duties? When and why should they intervene in which activity?
  • General & high-level concepts - Each chapter introduces many concepts, sometimes with a high level of details or in very specific areas. Focus only on the high-level concepts which any software engineer or business analyst should master (for example 3FN, cardinality in relationships, SDLC...).
If you master all of the above, you are in a good position to pass the exam successfully. Beware of the DMBOK index, it is missing some entries. Take a look at the table of contents, list of figures and list of tables to create the set of concepts to study.

Last words...

Overall, I don't think this exam has been designed and reviewed by professionals. It seems to have been written by consultants who have never taught and who have no formal training or experience in education.

For an organization promoting professionalism and excellence, I think they need to walk the talk in this area first if they want to thrive as a universally respected and trusted organization.

Don't misunderstand me, the DAMA DMBOK contains a lot of valuable knowledge, even if some areas are a bit fluffy (by trying to be too exhaustive, you often lose substance or relevancy) or refer to too many buzzwords without providing proper definitions. It has become one of my reference books. It is a pity they have chosen a small font size with wide line spacing, when a bigger font size with tighter line spacing would improve readability for the same number of pages.

I think DAMA is trying too hard to be prescriptive and to control whether people remember concepts, by creating an artificial certification hierarchy and by using inadequate methods, rather than focusing on the practicality of these concepts and on how people learn them (lowering barriers to entry). The intention is good; the method needs to be reviewed.

I would certainly value someone who has a CDMP certification more than someone who does not, especially when the job requires working with top management or on advanced business topics. It is less important if the job is highly technical, though still valuable for sure.


Friday, 21 April 2017

Cloud Mining - How Much Passive Income Can You Make?

About Cryptocurrency Mining

Cloud mining is profitable; I have tried it. The question is: what is the return on investment, and when will you get your money back? Considering mining fees, contract duration and many other factors, it's hard to guess which site offers the best cloud mining contracts. And what about reinvestment opportunities?

Well, rather than guessing, I've spent a bit of my money to find out the real return of such contracts. I have used Hashflare.io and Genesis Mining, which I trust most. Their helpdesk was responsive when I had questions. Recently, I have started working with MyCoinCloud too.

This post is about sharing my observations.

Warning: Cryptocurrencies have been very volatile since late 2016. The figures I am about to share should be interpreted with caution. I have seen a lot of fluctuations. These numbers may not be valid anymore in a couple of days, weeks or months. I will update them from time to time (last update: July 9th, 2017).

Cloud Mining Comparison

Contract prices are not included below, since they often change according to market conditions. The profitability column indicates the observed amount of coins produced per day for a given processing power, after fee deduction (i.e., what comes back into your pocket).

Break-even is the estimated amount of time required to get your investment back. In other words, if you put 1$ in mining contracts, how much time does it take to get your 1$ back? This figure is probably the most volatile one, as plenty of factors influence it. I'll describe them later in this post.

Break-even is computed according to current contract price, excluding any promotions, coupons or discount for bulk buying. Max is the maximum observed break-even since I have started investing in online mining contracts.
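As an illustration, here is a minimal back-of-the-envelope calculation for the Bitcoin/Hashflare line of the table below. The contract price and the BTC exchange rate are placeholders I picked for the example, not current quotes; only the daily production figure comes from my observations:

# break-even estimate for a 1 TH/s lifetime contract (all prices are illustrative assumptions)
awk 'BEGIN {
  price_usd = 120;        # assumed contract price for 1 TH/s
  daily_btc = 0.000192;   # observed daily production after fees (see table below)
  btc_usd   = 2500;       # assumed BTC price
  print price_usd / (daily_btc * btc_usd) / 30.4, "months"   # roughly 8.2 months
}'

Plug in the prices of the day and you get the current break-even; the figures in the table below were obtained the same way, with the prices observed at the time.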

When I initially wrote this post (which I update here and there), I provided data for fixed-term mining contracts. I have stopped monitoring these and decided to focus on lifetime contracts only. Therefore, the data provided below is only for lifetime contracts.  

Lifetime Mining Contract Profitability

Currency   Company      Profitability        Power     Break-Even   Max Observed
Bitcoin    Hashflare    0.00019200 BTC/day   1 TH/s    8.2 months   19 months
Litecoin   Hashflare    0.00003284 BTC/day   1 MH/s    5.4 months   17 months
Bitcoin    Gen-Mining   0.00029004 BTC/day   1 TH/s    6.8 months   16 months
Ethereum   MyCoinCl.    0.00045266 ETH/day   1 MH/s    5.9 months   7 months
ZCash      MyCoinCl.    0.00334300 ZEC/day   100 H/s   6.6 months   7 months

Genesis Mining pays on a daily basis. MyCoinCloud pays on a weekly basis. For Hashflare, payment transfers are manual.

Which Factors Influence Mining Profits?

  • Mining Difficulty - This is a parameter influencing the productivity of mining servers. The more people are mining a cryptocurrency, the higher its difficulty. The higher the difficulty, the more effort is required to produce a coin, and vice-versa. This parameter helps regulate the production of coins. It tends to follow the price fluctuations of cryptocurrencies, with some delay.
  • Contract Price - The general trend is an inverse correlation with a coin's mining difficulty and a positive correlation with a coin's value against a traditional currency (say USD). Since more and more people are mining coins, the mining difficulty rises. In order to keep their offer valuable, companies lower their contract prices. There is a notable exception: Litecoin contracts at Hashflare.io have gone from 9.9$ to 6.5$, then up to 13.5$ due to the Litecoin breakthrough in the first half of 2017.
  • Mining Fees - Fixed-term contracts tend to have no mining fees, as these are already priced into the contract. However, lifetime contracts have a daily fee per unit of computing power to cover electricity (amongst other things). Older mining hardware tends to consume more electricity than recent hardware. Typically, lifetime contracts produce coins as long as they are profitable. Then, the hardware is decommissioned.
  • Pool Fees - Some companies offer the possibility to select mining pools. Having performed some tests, I did not notice significant differences between them, except for the smaller ones whose profitability is more unpredictable and sometimes lower. I recommend avoiding pools not clearly publishing their fees.
  • Cryptocurrency Value - Bitcoin has seen its value rise from less than 700$ to more than 1200$ in about 4 months (Nov. 2016 to Feb. 2017). Since mining contracts produce cryptocoins, this factor is the most influential on profitability measured against traditional currencies. It can also heavily influence the mining difficulty.
  • ASIC Electronic Cards - These are electronic components designed for the sole purpose of mining cryptocurrencies. Each cryptocurrency uses a given hashing algorithm. Some of these algorithms can be implemented in ASIC cards in order to boost processing power and reduce electricity consumption (Bitcoin, Litecoin, Dash). However, this is not (or hardly) possible for other currencies (Monero, ZCash, Ethereum). For the former, this means mining hardware becomes obsolete faster, while difficulty often rises faster to regulate production. For the latter, the corresponding difficulty does not fluctuate much.
  • Halving - Some cryptocurrencies see their mining reward halve from time to time (Bitcoin likely in June 2020, Litecoin in August 2019, Zcash in October 2020). These dates can only be estimated. Halvings put a sudden stress on the profitability of older hardware. Among other currencies, Dash sees its mining reward decrease by 7% per year, while Monero's reward decreases slightly after each block.

Warnings & Recommendations

  • Online mining is not the only way to invest in cryptocurrencies - If one believes the value of a currency will rise, one may as well buy some and wait, rather than invest in mining contracts. Trading coins has been more profitable than mining contracts between November 2016 and June 2017, thanks to a spectacular rise. However, this rise has reached a plateau and future rises are unlikely to be as sharp.   
  • Coupons and promotions mitigate risks - A 10% or 15% coupon has a dramatic impact on break-even. Use them to mitigate the risk of a constant rise in difficulty and/or decrease of currency value (in USD for example). However, be careful. If you Google for some coupons for, say, Genesis Mining, some advertising mentions between 3% and 10%, while in reality, it is only 3% and they know it.
  • Short-term vs Long-term - The global trend is up for the most important cryptocurrencies. Trading their value makes sense, especially because of the high volatility. Mining contracts are not the best option for short-term objectives, but IMHO, they excel at mid- to long-term objectives. They provide profitability with peace of mind. You don't need to spend all your time in front of your laptop, chasing trading opportunities.

Reinvesting In Mining Contracts

Hashflare.io offers the possibility to automatically (or manually) reinvest produced coins into extra mining contracts. I did the maths for fixed-term contracts, but I don't see any value here, especially since it extends the break-even period. I am not saying there is no possibility for profits, but the extra risk is not worth it IMHO. Greed has already wiped out so many investors, I don't want to be the next one on the list...

For lifetime contracts, it's a different game. After an investment period (say 1 year), you still hold some processing power, and that has a value. I did some maths and computed the cash flow value of production with a 60% yearly discount (i.e., if it is worth 100$ now, it will be worth 100 * 40% = 40$ the next year, 40 * 40% = 16$ the following year, etc...). 60% might seem high for some, but remember the halvings and the constant rise of difficulty for some cryptocurrencies. I would rather be conservative and safe than sorry.
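To make the discounting concrete, here is a small sketch with made-up round numbers (100$ of yearly production at today's rates, 60% yearly discount as above):

# discounted value of a lifetime contract's production over four years (illustrative figures)
awk 'BEGIN {
  yearly_usd = 100;   # assumed value of one year of production at current prices
  d = 0.40;           # each following year is worth 40% of the previous one (60% discount)
  print yearly_usd * (1 + d + d^2 + d^3)   # ~162$; the infinite sum converges to ~167$
}'

Even with such a harsh discount, the processing power you keep after the first year still carries real value.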

Well, the outcome is that even with a high discount, reinvestment in lifetime contracts offers pretty good value. I see two strategies for beginners here: the very safe approach by which one does not reinvest anything before reaching break-even, and the cautious approach which is to not reinvest more than what you have already recovered. Say you have invested 100$ and regained 35$, you would not reinvest more than 35 / 100 = 35% of future coins produced by your processing power.

Now that I have recovered my investments, I am using a full re-investment strategy, since I am interested in maximizing long-term benefits.  

How To Get Started With Cloud Mining?

One issue to tackle is wallets. It is technically complicated to hold them on your laptop. Considering I mine several currencies, I have found Cryptonator to be a good solution (but with some caveats, read the warning **). Although Cryptonator offers cryptocurrency conversions, I found Changelly to offer better conversion rates. For my Bitcoin wallet and for conversions to EUR and SEPA wire transfers, I use Bitwala.

Then, buy your first online mining contract at Hashflare.io, Genesis Mining or MyCoinCloud (*). You can shave 3% off the purchase price at Genesis Mining by using this permanent coupon: 6M1WUC. Hashflare.io offers temporary coupons from time to time. These are published on their website and on their Facebook page.

If you enjoyed this post, please share it or like it !!! Thanks !!!

(*) Disclaimer: I do participate in the affiliate programs of Hashflare.io, Genesis Mining, Cryptonator, Bitwala and Changelly. If you register with the links in this post, both you and I will get some benefits (sometimes immediate, sometimes deferred, sometimes after doing some business with them).

(**) Warning: Some of my Bitcoin transactions with Cryptonator have been waiting for confirmation for several days now because the transfer fee was too low (***). Unfortunately, Cryptonator does not let one set one's own transfer fee. This resulted in failed EUR SEPA transactions too. I had to insist to finally receive the money on my bank account.

(***) Bitcoin has grown so popular that the system has had issues processing all transactions in a timely manner between May and June 2017. For several months, the involved parties did not agree on a solution to this looming issue. Recently (May 2017), they came to an agreement.

Sunday, 14 August 2016

Docker Concepts Plugged Together (for newbies)

Although Docker looks like a promising tool facilitating project implementation and deployment, it took me some time to wrap my head around its concepts. Therefore, I thought I might write another blog post to summarize and share my findings.

Docker Container & Images

Docker is an application running containers on your laptop, but also on staging or production servers. Containers are isolated application execution contexts which do not interfere with each other by default. If something crashes inside a container, the consequences are limited to that container. It is possible to open ports in a container; such containers can then interact with the external world via these ports, including with other containers that have opened ports.

You can think of a Docker image as a kind of application ready to be executed in a container. In fact, an image can be more than just an application. It can be a whole Linux environment running the Apache server and a website to test, for example. By opening port 80, you can browse the content as if Apache and the website were installed on your laptop. But they are not. They are encapsulated in the container.

Docker runs in many environments: Windows, Linux, Mac. One starts, stops and restarts containers with Docker using available images. Each container has its own private file system. One can connect to and 'enter' a container via a shell prompt (assuming the container is running Linux, for example). You can add files to and remove files from the container. You can even install more software. However, when you delete the container, these modifications are lost.

If you want to keep these modifications, you can create a snapshot of the container, which is saved as a new image. Later, if you want to run the container with your modifications, you just need to start a container with this new image.
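Here is a minimal command-line walk-through of that lifecycle, assuming an Ubuntu image; the container and image names are made up for the example:

docker run -it --name mybox ubuntu /bin/bash    # start a container and open a shell inside it
# ...inside the container: apt-get update && apt-get install -y curl, then type 'exit'
docker commit mybox ubuntu-with-curl            # snapshot the modified container as a new image
docker rm mybox                                 # delete the container; its changes live on in the image
docker run -it ubuntu-with-curl /bin/bash       # a fresh container from the snapshot still has curl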

In theory, it is possible to run multiple processes in a container, but it is not considered a good practice.

Docker Build Files & Docker Layers

But how are Docker images created in the first place? In order to create an image, you need Docker installed on your laptop. Then, in a separate directory, you'll create a file named Dockerfile. This file will contain the instructions to build the image.

Most often, you don't create an image from scratch; you rely on an existing image, for example Ubuntu. This is the first layer. Then, as docker build processes each line of the Dockerfile, each corresponding modification creates a new layer. It's like painting a wall: if you start with a blue background and then paint some parts in red, the blue disappears under the red.

Once docker build has finished its job, the image is ready. In other words, a Docker image is a pile of layers. Each time you launch a container, Docker simply reuses the built image for execution. It does not recreate it from scratch.
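As a sketch of what this looks like in practice for the Apache example above (the website directory and image name are assumptions for the example):

# write a minimal Dockerfile; each instruction adds a layer on top of the previous ones
cat > Dockerfile <<'EOF'
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y apache2
COPY website/ /var/www/html/
EXPOSE 80
CMD ["apache2ctl", "-D", "FOREGROUND"]
EOF
docker build -t my-website .          # processes the Dockerfile and assembles the layers into an image
docker run -d -p 80:80 my-website     # browse http://localhost as if Apache were installed locally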

Docker Volumes & Docker Registry

A Docker registry is simply a location where images can be pushed and stored for later use. Images are versioned with tags, including a 'latest' version. There is a public Docker registry (Docker Hub), but one can also install private registries.
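For instance, pushing the image built in the previous sketch to the public registry could look like this (the account name is a placeholder):

docker login                                     # authenticate against the registry
docker tag my-website myaccount/my-website:1.0   # tags act as versions; omitting the tag means 'latest'
docker push myaccount/my-website:1.0             # upload the image so other hosts can use it
docker pull myaccount/my-website:1.0             # retrieve it later, e.g. on a staging server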

A volume is a host directory located outside of a Docker container's file system. It is a means to make data created by a container in one of its directories available in an external volume directory on your laptop. A relationship is created between this inner container directory and the external directory on the local host. A volume 'belonging' to a container can be accessed by another container with the proper configuration. For example, logs can be created by one container and processed by another. This is a typical use of volumes.

Contrary to a container's own file system, if the container is erased, the data in its volume directory is not deleted. It can be accessed again later by the same or by other containers.

It is also possible to mount a local host directory onto a container's directory. This makes the content of the local host directory available in the container. In case of collision, the mounted data prevails over the container's data. It's like a poster on the blue wall. However, when the local host directory is unmounted, the initial container data is available again. If you remove the poster, that part of the wall is blue again.
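A quick sketch of both cases, reusing the my-website image from above (volume and directory names are made up):

docker volume create apachelogs                                             # a named volume, managed by Docker
docker run -d --name web -v apachelogs:/var/log/apache2 my-website         # this container writes its logs into the volume
docker run --rm -v apachelogs:/var/log/apache2 ubuntu ls /var/log/apache2  # another container reads the same data
docker rm -f web                                                            # the volume and its data survive the container

docker run -d -p 8080:80 -v "$PWD/website:/var/www/html" my-website        # bind-mount: the host directory 'covers' the container's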

But, Why Should I Use Docker?

Docker brings several big benefits. One of them is that you don't need to install and re-install environments to develop and test new applications, which saves a lot of time. You can also re-use images by building your own images on top of giants. This also saves a lot of time.

However, the biggest benefit, IMHO, is that you are guaranteed to have the same execution environment on your laptop as on your staging and production servers. Hence, if one developer works under Windows 10 and another on Mac, it does not matter. This mitigates the risk of facing tricky environment bugs at runtime.

Hope this helped.

Saturday, 26 September 2015

Explain React Concepts & Principles, Because I Am Not A UI Specialist

I have been reading React's documentation, but found it takes too many shortcuts in describing the concepts and how they relate to each other to understand the whole picture. It is also missing a description of the principles it relies on. Not everyone is already a top-notch Javascript UI designer. This post is an attempt to fill the gaps. I am assuming you know what HTML, CSS and Javascript are.

What Issues Does React Try To Solve?

Designing sophisticated user interfaces using HTML, CSS and Javascript is a daunting task if you write all the Javascript code by yourself to display, hide or update parts of the screens dynamically. A lot of boilerplate code is required, which is a hassle to maintain. Another issue is screen responsiveness. Updating the DOM is a slow process which can impact user experience negatively.

React aims at easing the burden of implementing views in web applications. It increases productivity and improves the user experience.

React Concepts & Principles

React uses a divide and conquer approach using components. In fact, they could be called screen components. They are similar to classes in Object Oriented Programming. It's a unit of code and data specialized in the rendering of a screen part. Developing each component separately is an easy task, and the code can be easily maintained. All React classes and elements are implemented using Javascript.

Classes & Components

With React, you will create React classes and then instantiate React elements using these classes. React components can use other React components in a tree structure (just like the DOM structure is a tree structure too). Once an element is created, it is mounted (i.e. attached) to a node of the DOM, for example, to a div element having a specific id. The React component tree structure does not have to match the DOM structure.

No Templates

If you have developed HTML screens using CSS, it is likely you have used templates to render the whole page or parts of it. Here is something fundamentally different in React: it does not use templates. Instead, each component contains some data (i.e., state) and a method called render(). This method is called to draw or redraw the parts of the screen it is responsible for. You don't need to compute which data lines were already displayed in a table (for example), which should be updated, which should be deleted, etc... React does it for you in an efficient way and updates the DOM accordingly.

State & Previous State

Each component has a state, that is, a set of keys and values, also called properties. It is possible to access the current state with this.state. When a new state is set, the render() method is called automatically to compute the parts of the screen which have to be updated. This is extremely useful when JSON data is fetched with an Ajax call. You just need to set it in the corresponding React components and let React perform the screen updates.

JSX & Transpilation

Creating a tree of React UI components using plain Javascript means writing lengthy-ish code which may not always be very readable. React introduces JSX, which is something between XML/HTML and Javascript. It provides a means to create UI component trees with concise code. Using JSX is not mandatory.

On the downside, JSX needs to be translated into React-based Javascript code. This process is called transpiling (as opposed to compiling) and can be achieved with Babel. It is possible to preprocess (i.e., pre-transpile) JSX code on the server side and only deliver pure HTML/CSS/Javascript pages to the browser. However, the transpilation can also happen on the user side: the server sends HTML/CSS/Javascript/JSX pages to the browser, and the browser transpiles the JSX before the page is displayed to the user.
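For example, pre-transpiling a JSX file on the build machine with Babel might look like this (the package names correspond to the Babel 6 era and may differ in your setup):

npm install --save-dev babel-cli babel-preset-react            # Babel command-line tool plus the React/JSX preset
./node_modules/.bin/babel app.jsx --presets react -o app.js    # app.js now contains plain Javascript calls to React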

That's it! You can now dive into React's documentation. I suggest starting with Thinking In React. It provides the first steps to design and implement React screens in your applications. I hope this post has eased the React learning curve!

Monday, 14 October 2013

Creating An OpenShift Web/Spring Application From The Command Line

OpenShift offers online functionalities to create applications, but this can also be achieved from the command line with the RHC Client Tool. For Windows, you will first need to install RubyGems and Git. The procedure is straightforward.

Git SSH Communication

OpenShift requires SSH communication between local Git repositories and the corresponding server repositories. Generating SSH keys for TortoiseGit on Windows can be tricky, but this post tells you how to achieve it.

Maintenance

From time to time, run the following commands for RubyGems and RHC updates:

> gem update --system
> gem update rhc

Creating the Spring application

Under Windows, open a cmd window and go to the directory where you want to create the application. Assuming you want to call it mySpringApp, run the following command:

> rhc app create mySpringApp jbosseap-6

The application will be automatically created and the corresponding Git repository will be cloned locally in your directory.

'Unable to clone your repository.'

If you encounter the above error, you will need to clone the Git repository manually. Assuming that TortoiseGit for Windows has been installed properly and that you have generated your SSH keys for Git properly, right-click on the directory where you want to clone the Git repository:

OpenShift git cloning manually

Enter the SSH URL in the first field (you can find it under 'My Applications' in OpenShift). Make sure you check 'Load Putty Key' and that the field points at your .ppk file.

This solution has been made available on StackOverflow too.

Making it a Spring Application

To transform the above application into a Spring application, follow instructions available here.

If you cannot execute Git from the command line, it is most probably not in your PATH. You will need to add it and open a new command line window.

That's it, you are ready to go. Open the application in your favorite IDE. Don't forget to (Git) push the application to make it accessible from its OpenShift URL.
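A typical sequence after editing the code locally might look like this (the commit message is just an example; pushing to the OpenShift repository triggers the deployment):

> git add .
> git commit -m "Add Spring configuration"
> git push origin master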

Tuesday, 1 October 2013

September 21-22-23, 2013 - Search Queries Not Updated in Google Webmaster Tools

This weekend, many people have started reporting the same issue in Google's Webmaster Forum: no more daily search query information updates. For most, the data reporting stopped on September 23, 2013, but I have observed this since September 22, 2013.

Yesterday, a top contributor announced that this issue had been "escalated to the appropriate Google engineers". He mentioned that the issue started on September 21st. Therefore, it took 9 days before someone could confirm that Google is aware of it. Google Webmaster Tools (GWT) is known to lag 2 or 3 days behind when it comes to search query data, which explains why most webmasters only started to ask questions at the end of last week. This issue made the headlines of Search Engine Roundtable too.

In the confirmation post, a link to a 2010 video was posted, in which Matt Cutts discusses which types of Webmaster Tools errors should be reported to Google. He mentions that Google engineers are a bit touchy when they are asked whether they monitor their systems. So did Google know about this issue since September 21st and deliberately decide not to answer posts in the Webmaster Tools forum for 9 days, or did they just miss it because it was not monitored?

Many people have been hit by the recent Panda updates. August 21st, September 4th and more recent dates have triggered a lot of comments in forums. Many websites lost all their traffic without any explanation. No message in GWT, no manual penalty, nothing. Some of these sites were using plain white-hat SEO. Webmasters working hard to produce quality content need GWT search query data feedback, especially when they believe some of their sites have been hit by recent updates. It helps them find out whether they have implemented the proper corrections or not.

On September 11th, a new Matt Cutts video was posted about finding out whether one has been hit by Panda, and whether one has recovered from it. Unfortunately, it does not contain clear-cut information answering the question. The video only confirms that Panda is now integrated into indexing and that one should focus on creating quality content. Google's interpretation of quality content is still vague, yet they have implemented algorithms to sort web pages.

If there is a bug impacting customers using their service, why isn't Google officially open and communicative about it? This has been an ongoing complaint from webmasters. I can understand that Google does not want to give too much information about their systems. They don't want hackers to exploit it against them. However, it clearly seems that the focus is more on not communicating with hackers than on communicating openly with regular webmasters. Is Google in defensive mode?

Google is capable of algorithmically detecting when a website (or some part of a website) has quality issues. It does not hesitate to penalize such websites. Then why doesn't Google communicate automatically about these issues to regular webmasters in GWT? It is algorithmically possible, and scalable too. Google is not the only party interested in creating quality websites; it is in the interest of regular webmasters too. Of course, hackers would try to exploit this information, but overall, if regular webmasters had it too, they would create better content than hackers. Users would still sort between good and bad websites, not only Panda.

Sometimes, it really seems like Google does not truly want to collaborate with regular webmasters. I notice selective listening followed by monologues. Ask me questions and I'll answer them. I won't acknowledge any flaws, but I'll secretly work on these so you can't poke me again. This is not a collaborative dialogue, it is a defensive attitude. I believe that acting with excessive caution directly hampers the achievement of one's own objectives.

My strong opinion is that if Google solved this communication issue, it would bring much more return than any other stream of tweaks to their Panda algorithm. Give people the information they need to do a good job, empower them, trust them. Right now, the level of frustration is pretty high in the webmaster community. Frustration leads to lack of motivation. Lack of motivation decreases productivity. No productivity means not a chance to see new quality content or improvements.

There is a needless vicious circle and Google can do something about it, for its own good too.

Monday, 9 September 2013

Best Responsive Design Breakpoints

While trying to find an answer to my own question: "What are the best responsive design breakpoints?", I have performed a small statistical study over SmartPhone screen widths (portrait and landscape) using information provided by i-skool.

SmartPhone Screen Width Study (Portrait & Landscape)

The above table shows how often each SmartPhone width appears in the source data. There are five peaks:
  • 320 pixels
  • 480 pixels
  • 768-800 pixels
  • 1024 pixels
  • 1280 pixels
These look like good responsive design breakpoint candidates.

Best Google Ad Formats

Google offers several ad formats. Assuming the following breakpoints, here are examples of adequate ad formats with regards to width:
  • 320 to 479 pixels - Mobile Leaderboard (320x50), Half Banner (234x60), Medium Rectangle (300x250)
  • 480 to 767 pixels - Banner (468x60), 468x15 unit displaying 4 links
  • 768 pixels and above - Leaderboard (728x90), 728x15 unit displaying 4 links