Lean Portfolio Management in Digital Transformation

Some organizations start their Digital Transformation by considering the possibilities that technologies like artificial intelligence or cloud services could bring. However, Digital Transformation is not about the use of innovative technologies; technologies are only enablers. Digital Transformation should transform the customer experience and create new business models that bring added value and have concrete objectives.

📢 Technologies are only enablers. The #digitaltransformation should transform the customer experience and create new business models that must bring added value and must have concrete objectives.

Other organizations use digital products or computers to automate processes. Although this can help to reduce costs, that is only Digitization. Organizations must have a clearly defined digital identity, digital culture, and digital strategy. The result of a Digital Transformation should be new business models and a customer experience fundamentally different from the traditional ones. In many cases, achieving this requires transforming the traditional business processes and the company structure too.

📢 Organizations must have a clearly defined #digitalidentity, #digitalculture, and #digitalstrategy.

It is also very important to be aware that Digital Transformation creates a market environment that is volatile, uncertain, and complex. This creates big opportunities for organizations that are agile and can adapt to the new environment better than others.

Lean Portfolio Management aligns the organization’s strategy with execution in a lean and agile way. This is crucial to survive in the environment created by Digital Transformation.

In the digital era, it is crucial to adapt quickly, respond to competitive threats, and efficiently validate and deliver customer value. Completing my Lean Portfolio Management certification helped me learn new tools and methods for successfully managing Digital Portfolios.

Digital Transformation in the Power Industry

Thank you to the ⚡Energy Talks 🎙 team for inviting me to talk about digital transformation 🤖📈 #digitaltransformation

“Digital transformation changes how to create and deliver value to customers.”

Machine Learning: Supervised Learning

The goal of this post is to present the most popular supervised learning algorithms.

There are two types of supervised learning algorithms: regression and classification. Both aim to make a prediction based on input data provided during training. That input data contains information about the independent variables and the dependent variable. The dependent variable is what the algorithm will predict later using new data. A minimal code sketch contrasting the two types follows the lists below.

Regression: The dependent variable is a number.

  • Linear Regression
  • Polynomial Regression
  • Support Vector Regression (SVR)
  • K-nearest neighbors (KNN)
  • Decision Trees & Random Forest

Classification: The dependent variable is a category.

  • Logistic Regression (Binary classification)
  • Naive Bayes
  • Support Vector Machine (SVM)
  • K-nearest neighbors (KNN)
  • Decision Trees & Random Forest
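As a quick illustration of the difference between the two types, here is a minimal sketch in plain C# (no machine learning library; the data, numbers, and method names are made up for this example): a tiny least-squares linear regression that predicts a number, and a simple 1-nearest-neighbor rule that predicts a category.

using System;
using System.Linq;

// Minimal sketch contrasting regression and classification (illustrative data only).
public static class SupervisedLearningSketch
{
    // Regression: fit y = a + b*x with ordinary least squares, then predict a number.
    public static double PredictWithLinearRegression(double[] x, double[] y, double newX)
    {
        double meanX = x.Average(), meanY = y.Average();
        double b = x.Zip(y, (xi, yi) => (xi - meanX) * (yi - meanY)).Sum()
                 / x.Sum(xi => (xi - meanX) * (xi - meanX));
        double a = meanY - b * meanX;
        return a + b * newX; // the prediction is a number
    }

    // Classification: 1-nearest neighbor, return the category of the closest training point.
    public static string PredictWithNearestNeighbor(double[][] features, string[] labels, double[] newPoint)
    {
        int closest = Enumerable.Range(0, features.Length)
            .OrderBy(i => features[i].Zip(newPoint, (f, n) => (f - n) * (f - n)).Sum())
            .First();
        return labels[closest]; // the prediction is a category
    }

    public static void Main()
    {
        // Regression: hours studied (independent) -> exam score (dependent).
        Console.WriteLine(PredictWithLinearRegression(
            new[] { 1.0, 2.0, 3.0, 4.0 }, new[] { 52.0, 61.0, 70.0, 79.0 }, 5.0)); // 88

        // Classification: (height, weight) (independent) -> size label (dependent).
        Console.WriteLine(PredictWithNearestNeighbor(
            new[] { new[] { 1.7, 65.0 }, new[] { 1.9, 95.0 } },
            new[] { "light", "heavy" },
            new[] { 1.85, 90.0 })); // heavy
    }
}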

Supervised Learning (high-resolution image available).

What I did the last 3 years

I have not had time to write for a while; the goal of this post is to summarize what I did over the last three years besides my job.

1. Master’s degree: Data Analytics and Business Intelligence

I decided to go back to university to study Data Science. I completed the following studies:

Master’s degree in Business Intelligence, 2018 – 2019 (Universitat Oberta de Catalunya – Spain)
– Gather and analyze information relevant to a company’s environment.
– Business oriented data analytics. (Customer and operations analytics)
– Use of procedures, skills, applications, tools and practices to support decision-making.

Postgraduate degree in Data Analytics and Big Data, 2016 – 2018 (Universitat Oberta de Catalunya – Spain)
– Data mining, data analytics and visual analytics.
– Data management, data governance and Big Data.
– Machine learning and artificial intelligence.

During that time, I was involved in a couple of personal projects besides my job at OMICRON electronics.

2. Data Analyst hobbyist

I supported the eSports club x6tence by doing data analysis for Clash Royale. For this, a web page and a SQL Server database in Azure were used for data acquisition, and Power BI for the data analysis.

This project was excellent for applying a lot of the things I was learning during my studies, with a special mention to Power BI.

3. Software Development: Doctor Decks

Doctor Decks was a project on which I worked together with a work colleague. The goal of the software was to find good combinations of cards for the game Clash Royale. The product was available on the web and also had mobile apps: Windows Universal, iOS, and Android. A cloud app in Microsoft Azure collected the data from different APIs.

This was an excellent project to learn a lot about data analysis and to improve my knowledge of the Azure cloud, web development with Angular, and app development with Xamarin.

The web had thousands of visits every day and the app more than 1M downloads! After more than two years of success, we decided to end the project due to the maintenance effort.

ASP.NET Core 1 – Authorization using Policies

The goal of this post is to show how we can protect controller actions in ASP.NET Core 1 using Policies.

The whole code is available on GitHub: ASP.NET Core 1, Security using Policies.

With policies, we no longer need to hard-code roles or names in our Authorize attributes. A policy is authorization logic that contains one or more requirements.

How to use a policy?

The concept is very simple: once we have defined a policy, we can add it to our Authorize attributes…

[HttpGet]
[Authorize(CookieMonsterSecurity.OnlyGoodMonstersPolicy)]
public IActionResult Info()
{
    // ... something that only good monsters can do
    return Ok();
}

How to create a policy?

We have to define our policies in our Startup class, in ConfigureServices. We need a policy name, a list of valid authentication schemes and a list of requirements.

// Configure authorization
services.AddAuthorization(options => options.AddPolicy(CookieMonsterSecurity.OnlyGoodMonstersPolicy, policy =>
{
    policy.AuthenticationSchemes.Add(CookieMonsterSecurity.CookieMonsterAuthenticationSchema);
    // Our own requirement logic...
    policy.AddRequirements(new IsGoodMonsterRequirement());
}));

We can add more than one requirement to our policy. There are some pre-built requirements (a short example follows the list):

  • policy.RequireAuthenticatedUser()
  • policy.RequireClaim(…)
  • policy.RequireRole(…)
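
For instance, a policy could be composed entirely of these pre-built requirements. The policy name below is made up for illustration and is not part of the sample project:

// Sketch: a policy built only from pre-built requirements.
// "AuthenticatedGoodMonsters" is an illustrative name, not from the GitHub sample.
services.AddAuthorization(options => options.AddPolicy("AuthenticatedGoodMonsters", policy =>
{
    policy.RequireAuthenticatedUser();
    policy.RequireClaim(CookieMonsterSecurity.MonsterTypeClaim, CookieMonsterSecurity.MonsterTypes.Good);
}));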

But the most flexible way is to add a custom requirement; doing this, we can write our own logic:

  • policy.AddRequirements(new IsGoodMonsterRequirement());

To write our requirement, we derive from the base class AuthorizationHandler and implement the interface IAuthorizationRequirement.

This requirement checks that the user is authenticated and has the claim “MonsterTypeClaim” = “Good”:

public class IsGoodMonsterRequirement : AuthorizationHandler<IsGoodMonsterRequirement>, IAuthorizationRequirement
{
    protected override void Handle(AuthorizationContext context, IsGoodMonsterRequirement requirement)
    {
        Console.WriteLine("Is a good monster?");
        // The requirement only succeeds for authenticated users...
        if (context.User.Identity.IsAuthenticated)
        {
            Console.WriteLine("... is authenticated...");
            // ... that carry the MonsterTypeClaim with the value Good.
            if (context.User.HasClaim(CookieMonsterSecurity.MonsterTypeClaim, CookieMonsterSecurity.MonsterTypes.Good))
            {
                Console.WriteLine("... and has the MonsterTypeClaim = MonsterTypes.Good!");
                context.Succeed(requirement);
            }
        }
    }
}
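
To complete the picture, the claim has to be issued when the user signs in. The following is only a rough sketch of how that could look with the ASP.NET Core 1.x authentication API; the Login action, the user name, and the claim values are illustrative and may differ from the GitHub sample.

// Sketch only: issue the MonsterTypeClaim at sign-in.
// Uses the ASP.NET Core 1.x HttpContext.Authentication API (replaced in 2.0);
// the action name and claim values are illustrative, not taken from the sample project.
[HttpPost]
[AllowAnonymous]
public async Task<IActionResult> Login()
{
    var identity = new ClaimsIdentity(new[]
    {
        new Claim(ClaimTypes.Name, "Cookie Monster"),
        new Claim(CookieMonsterSecurity.MonsterTypeClaim, CookieMonsterSecurity.MonsterTypes.Good)
    }, CookieMonsterSecurity.CookieMonsterAuthenticationSchema);

    await HttpContext.Authentication.SignInAsync(
        CookieMonsterSecurity.CookieMonsterAuthenticationSchema,
        new ClaimsPrincipal(identity));

    return Ok();
}

After signing in with this cookie, the Info action protected by the OnlyGoodMonstersPolicy should let the good monster through.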