Trying out Microsoft SQL Server on Linux

    Years ago, I never would have thought I would write a title like that, but things change, and Microsoft is now paying attention to environments other than its own. This has given us the opportunity to try some of its tools without being forced to install Windows, which is much appreciated.

    I am going to install SQL Server on Linux (specifically, on Linux Mint). Here I will detail the installation process as a tutorial, and I will also run some simple queries to demonstrate the use of SQL Server.

    During the installation process we will need administrator privileges, so we switch to our root account:
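On Linux Mint, something like the following should do it (assuming your user can use sudo):

```shell
# Open a shell as root for the rest of the installation steps
sudo su -
```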

    We create a directory for SQL Server tools and notes and change into it (this step is entirely optional):
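For example (the directory name here is just an arbitrary choice):

```shell
# Optional: a scratch directory for SQL Server tools and notes
mkdir -p ~/sqlserver
cd ~/sqlserver
```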

    We download and install the keys for the Microsoft repositories:
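Microsoft distributes its repository signing key over HTTPS; a command along these lines (run as root, and requiring curl) registers it with apt:

```shell
# Download and register Microsoft's GPG key for its package repositories
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add -
```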

    We add the Microsoft repository:
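Linux Mint is based on Ubuntu, so the Ubuntu 16.04 repository definition is the one assumed here:

```shell
# Add Microsoft's SQL Server repository (Ubuntu 16.04 base assumed)
curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-server.list \
    > /etc/apt/sources.list.d/mssql-server.list
```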

    We update our package lists:
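With apt, that is simply:

```shell
# Refresh the package index so the new repository is picked up
apt-get update
```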

    Finally, we install SQL Server:
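The package name is mssql-server:

```shell
apt-get install -y mssql-server
```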

    We run the configuration tool:
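The setup tool ships with the package; it asks you to accept the license, pick an edition, and set the sa password:

```shell
# Interactive first-time configuration of the SQL Server instance
/opt/mssql/bin/mssql-conf setup
```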

    We verify that the SQL Server service is running correctly:
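With systemd, a quick status check looks like this:

```shell
# The service should report "active (running)"
systemctl status mssql-server
```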

    Now we install the command-line tools:
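These live in a separate Microsoft repository (again assuming an Ubuntu 16.04 base); the install will prompt you to accept a couple of licenses:

```shell
# Add Microsoft's tools repository and install sqlcmd and friends
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list \
    > /etc/apt/sources.list.d/msprod.list
apt-get update
apt-get install -y mssql-tools unixodbc-dev

# sqlcmd is installed under /opt/mssql-tools/bin; add it to the PATH
echo 'export PATH="$PATH:/opt/mssql-tools/bin"' >> ~/.bashrc
```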

    We connect to our local instance:
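Using sqlcmd and the sa password we set during configuration (it will be prompted for):

```shell
# Connect to the local instance as the sa user
sqlcmd -S localhost -U sa
```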

    We create a test database and list the existing databases:
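The statements can be typed at the interactive sqlcmd prompt (followed by GO), or run non-interactively with -Q as below; the database name "testdb" is just an example:

```shell
sqlcmd -S localhost -U sa \
    -Q "CREATE DATABASE testdb; SELECT name FROM sys.databases;"
```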

    In our new database, we create a test table and insert a record:
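Again with illustrative names (the table and its columns are arbitrary examples):

```shell
sqlcmd -S localhost -U sa -d testdb \
    -Q "CREATE TABLE inventory (id INT PRIMARY KEY, name NVARCHAR(50));
        INSERT INTO inventory VALUES (1, 'laptop');"
```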

    We run a test query:
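For example, selecting back the row we just inserted:

```shell
sqlcmd -S localhost -U sa -d testdb -Q "SELECT id, name FROM inventory;"
```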

    As you can see, SQL Server was unexpectedly easy to install, configure, and use, especially keeping in mind that we used an operating system that is not Microsoft's own. In this exercise we learned to create tables and run queries with the SQL Server tools, which are very similar to the tools of other RDBMSs, such as MySQL.


Alan Verdugo / 2017/08/18 / Uncategorized / 0 Comments

Getting out of the maze with A star

    A local IT company (who shall remain unnamed in this post and shall be thankful for that) was offering free tickets to this year’s Campus Party event in Mexico. To get the tickets, you needed to complete a programming challenge. Since I’ve never attended any Campus Party* and I enjoyed solving the programming challenge for the Wizeline event, I took some time to solve this one.

    Basically, the challenge was to find the optimal way out of a square, two-dimensional maze. Using an API, you registered for the challenge and requested a maze of size n by n. Then you were supposed to find the optimal path out of it (using a program, of course) and submit that path as a list of (x,y) positions, starting at (0,0) and finishing at the position of a goal (designated by “x”). An example of a small 8×8 maze would be something like this:

    The zeroes are obstacles or walls, and the ones are clear paths. So the solution in the previous example is obvious: (0,0), (0,1), (0,2), (0,3), (1,3), (2,3), (2,2), (3,2), (4,2), (4,3), (4,4), (5,4), (6,4), (7,4), (7,5), (7,6), (7,7).

    Here is another example:

    Which is still obvious. However, when I began to request mazes of larger sizes, I noticed the full complexity of the problem: there were many bifurcations and dead ends. Of course, mazes are supposed to be confusing; that is the whole point of their existence. And not only that: I had to submit the shortest path from start to finish. That meant I could not use a brute-force method that walked every possible path in the maze until it found the goal. The best part was that I had to solve a 1000×1000 maze to claim the prize.

    A 1000×1000 maze might not sound very big, but once you think about all the possible configurations in that space, you realize it is not an easy task. Thankfully, getting out of mazes is a very old problem, pioneered by Cretan kings who wanted to hide away their funny-looking stepsons. For that reason, a lot of smart people have spent a lot of time trying to find the best solution to such a problem, better known as the “shortest path problem”. Among those people was Edsger W. Dijkstra, a Dutch mathematician and a computer scientist who rarely used a computer. Dijkstra is one of the elder gods of computer science and now spends his afterlife looking disapprovingly at students who use GOTO statements.

    In 1959, Mr. Dijkstra designed an eponymous algorithm to find the shortest path between two points in any structure where there could be obstacles, varying distances, bifurcations, and dead ends. This algorithm (or a similar one, at least) is what mapping software uses to recommend routes (I believe Google Maps uses contraction hierarchies, since they can pre-compute routes to improve execution times).

    One of the many variations of Dijkstra’s algorithm is the A* algorithm (pronounced “A star”). It was created in 1968 by Peter Hart, Nils Nilsson, and Bertram Raphael, all of them Stanford scientists. A*, in turn, has many variations of its own.
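To make the idea concrete, here is a minimal sketch of A* on this kind of grid (it is illustrative, not my actual submission): cells follow the encoding from the examples above, with 1 for walkable cells and 0 for walls, and the heuristic is the Manhattan distance to the goal.

```python
import heapq

def astar(grid, start, goal):
    """Shortest path on a square grid: 1 = walkable, 0 = wall.

    Positions are (row, column) tuples; returns the path as a list of
    positions, or None when the goal is unreachable.
    """
    n = len(grid)

    def heuristic(p):
        # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(heuristic(start), start)]  # (estimated total cost, node)
    came_from = {}
    g_score = {start: 0}  # best known cost from start to each node

    while open_heap:
        _, current = heapq.heappop(open_heap)
        if current == goal:
            # Walk the parent links backwards to rebuild the path
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            neighbor = (current[0] + dr, current[1] + dc)
            if (0 <= neighbor[0] < n and 0 <= neighbor[1] < n
                    and grid[neighbor[0]][neighbor[1]] == 1):
                tentative = g_score[current] + 1
                if tentative < g_score.get(neighbor, float("inf")):
                    g_score[neighbor] = tentative
                    came_from[neighbor] = current
                    heapq.heappush(
                        open_heap, (tentative + heuristic(neighbor), neighbor))
    return None  # the open set is exhausted: no path exists
```

For instance, on the 3×3 grid [[1, 1, 0], [0, 1, 0], [0, 1, 1]], astar(grid, (0, 0), (2, 2)) returns the five-position path from corner to corner.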

    So, I used an implementation of the A* algorithm to successfully find the shortest path in the 1000×1000 maze. I sent my solution, and even though the API itself confirmed it was the optimal path out of the maze, I never got the prize, not even a reply saying “somebody solved it before you” or “we ran out of prizes”. I found that very unprofessional and irritating, since the rules specifically said to send an email to a certain person announcing the solution.

    Since the Campus Party is now over and I am still a little salty about being ignored, I uploaded my solution to my GitHub repository. It is a very quick and dirty solution, but it works, so don’t laugh too much (or better yet, improve it and create a pull request). Thankfully, I learned many interesting things and had fun doing this programming exercise, so it was not a complete waste of time.


* The total number of actual parties I’ve attended tends to zero.


Alan Verdugo / 2017/07/12 / Uncategorized / 0 Comments

DevOps crash course at Wizeline


    A few weeks ago I was contacted by a friend who used to work with me. He told me that he was now working at Wizeline, a relatively new IT company. I had heard of them before and was intrigued. He also told me they would be sponsoring a two-day DevOps crash course and invited me to attend.

    To be honest, my interest in DevOps is minimal. I still think the area needs to become much more mature to get the recognition it deserves. That is precisely why I was interested in attending: I wanted to learn more about it and form a better-informed opinion about the professionals in the DevOps world and what I can learn from them. Turns out, I can learn a lot.

    The course was part of the Wizeline Academy initiative, an effort by Wizeline to share knowledge with its employees and with other people in the community, and one that has been impressive so far. They have organized several courses (including free meals and swag for the attendees) and also sponsored meetups for the local communities of programmers, project managers, and DevOps professionals. Hopefully they will soon organize a data science course like the one they did in Mexico City.

    The lecturer was Kennon Kwok, a customer architect at Chef itself, who regularly teaches this course. I researched Kennon and was very impressed with his resume. Seriously, take a look at this.

    The most fun part of this was the application to the course. It was not enough to send your information. Since Wizeline’s offices are not that big and it was clear there would be a lot of interest in the course, they implemented a “capture the flag” challenge to narrow the attendees down to a final list of just 25. It was very similar to other challenges I’ve seen from IT recruiters. It involved following directions and clues that only led to more difficult clues. For example, most of the instructions were encrypted, so you had to decrypt them first and then understand the clue. I would really like to explain the whole process and how I solved it, but I think it is better when people solve these things on their own. Besides, the challenge was not that hard: it had just the right amount of difficulty to keep it interesting. To be honest, solving it was the most fun I’ve had in months. During the course I heard that hundreds of people applied, but only a few of us solved it all.

    A few days after I completed the challenge, I was contacted by Wizeline and asked whether I would be able to attend, along with a few questions just to confirm that my English was good enough to understand a native speaker like Kennon. At this point I was really excited about the course. The fact that I had earned my attendance by solving the challenge made me look forward to it. However, the challenge was aimed much more at programming and hacking skills than at DevOps.

    A couple of days before the course, the local DevOps meetup took place precisely at the Wizeline offices. One of the lecturers was Kennon; the other was Basilio Briceño, another DevOps engineer with a lot of experience. Attending this meetup was probably overkill since I was already going to attend the course a few days later, but I wanted to see the Wizeline offices before the training began. Besides, the DevOps meetup is organized by Emerson, who has been a great friend for years, and I also got to see many people who used to work with me. This meetup was sponsored by both Wizeline and Epam, which is surprising since the two companies are competitors and lately have been in an unspoken war, trying to hire as many people as possible. It is nice to see that rival companies can collaborate to improve the local community.


The course

    I arrived early and got a healthy breakfast that was provided by Wizeline. I realized there were almost 30 of us attending the course, and I only knew one of them. This was not surprising; after all, the challenge required some programming skills, and most people I know have gravitated towards the DevOps/sysadmin path. There was also a project management group having breakfast with us. They were there for another of the Wizeline Academy courses.

    I won’t go into much detail about what we actually did in the course, but I will mention that we played with recipes, Berkshelf, resources, cookbooks, tests, virtual machines, and containers. I think we all learned a lot, even people who already had some experience with Chef. Kennon was very patient but also challenged us to think outside the box and try our own solutions. The Wizeline DevOps team was also present, trying to learn from Kennon and serve as guides to the rest of us.



    At the end of the course we had a little “graduation ceremony” where we received a symbolic document commemorating our accomplishment. We also got a free Wizeline t-shirt and a couple of stickers. There was also a little party with free drinks and snacks (something Wizeline employees enjoy permanently). Alas, I had to attend a concert, so I had to leave early. You can read all this from Kennon’s point of view in his LinkedIn post.

    Overall, I am very satisfied with this experience. Everything was very enjoyable and interesting, from the selection process and the logistics to the actual course and the aftermath. I could tell that a lot of effort went into this, and at times I could not believe I was getting all this for free. As I said before, hopefully Wizeline (and other companies) will continue organizing and sponsoring these events. It obviously helps the IT community, which in turn raises the level of knowledge and abilities of all of us, and this is extremely beneficial to companies like Wizeline.


Alan Verdugo / 2017/07/04 / Uncategorized / 0 Comments

IBM Interconnect 2017

    I had the opportunity to attend IBM Interconnect 2017. This was my first time attending a serious technical conference and I must say that it was an excellent experience for many reasons.

    The reason I was sent to Interconnect in the first place was to show a proof of concept of a project I had been working on for the past few months. I might write about that project in a future post but in this post I will focus on the overall Interconnect experience and the things I learned during the event.

    As usual, the event took place in Las Vegas, more specifically, in the Mandalay Bay convention center. There was another event called IBM Amplify at the MGM Grand, but I did not have the time to attend (which is quite ironic since I was staying right there, at the MGM hotel).

    I had access to the venue since day zero, which means I got to see how everything was put together. As a matter of fact, I was extremely disappointed during the first day because I got to the convention center and only saw a gigantic warehouse with many boxes in it. I was impressed by the ridiculous size of the convention center, but I was much more impressed when I saw how, in just a matter of hours, this almost-empty warehouse was converted into an actual technology conference, complete with booths, places to eat, enormous screens, a couple of mainframes, and a few humanoid robots. That, and thousands of human beings.

    I did have some free time every now and then, which I used to get lost in the crowd, grab some swag, speak with random interesting people, and attend a few sessions. However, I missed some of the sessions that I wanted to attend, like Ginni Rometty’s keynote and Will Smith’s interview. Thankfully, replays are available at IBMGO and/or YouTube. The following are some of the things that I liked the most.


Mainframes

    IBM had on display a couple of their legendary mainframes. They look impressive, mystifying, and gigantic by today’s standards. A very friendly representative approached me and explained the advantages of mainframes, which, according to her, are far from dead. She gave me a quick tour of the components, mentioned that the mainframe’s main focus is on reliability and compatibility, that most Fortune 500 companies still rely on mainframes, and that they are far more affordable than I expected. It was quite an eye-opening experience, learning that such old technology is still alive and well.

Recorded Future

    To be honest, I had no idea what Recorded Future was. I only got close to this booth to get a figurine of their super-cute mascot, Marty the Martian. However, once I was at their booth, Alex (whose last name I don’t remember) approached me and explained what Recorded Future does: they basically crawl the web looking for intelligence about security threats, then use machine learning and all sorts of algorithms to warn their clients about vulnerabilities and exploits that could affect them in the near future. They are basically taking a proactive approach to IT security using analytics, which I think is a great idea.


IBM Analytics

    I spent some time talking with the IBM analytics teams; they were very friendly and answered all of my novice questions. In fact, they provided very useful recommendations about Watson Analytics and Data Science Experience.


Ubuntu

    This was actually Canonical’s booth, but everything was branded with the Ubuntu logo. Ivan Dobos, a solutions architect, kindly explained to me how Juju works and its use cases. I was very impressed by Juju’s capabilities, and it is something that I will definitely explore in the near future.

Phone chargers

    There were a couple of lockers where attendees could lock and charge their phones. A brilliant and very simple idea. Of course, this is not cutting-edge technology, but it was smart, useful and easy to use, which are three characteristics that are often forgotten while designing solutions to problems.

Bluemix Server Challenge

    Somewhere in an IBM office, somebody was faced with a critical problem: “How can we make videogames even more nerdy?” The answer is the Bluemix Server Challenge, a VR game where you take the place of a heroic data center admin and pick up hardware that needs to be correctly placed into a rack. I did not have time to play it, but everybody absolutely loved it.


    During my days at the conference, I heard so many languages and saw so many faces. Technology truly is one of the few things with the ability to bring people together regardless of nationality, language, or any preconceived “differences”. I was often reminded of those lines in the Hacker’s Manifesto:

This is it… this is where I belong…
I know everyone here… even if I’ve never met them, never talked to them, may never hear from them again… I know you all…

    I now understand that conferences of this size are best used as intelligence-gathering points, where decision makers, innovators, thought leaders, and futurists can get a first-hand idea of the technological trends that will inevitably influence the direction of other industries in the following years. Even better, all these people can interact to generate more ideas.

    I hope I have another chance to attend Interconnect (or any other tech conference) again. More importantly, I hope I can continue attending while being paid for it. While the tickets could be seen as expensive, I am convinced that these conferences are invaluable if you take the time to attend labs and sessions and try to engage in conversations with random people. After all, smart people from all over the world travel to attend, and you never know who may be listening to your ideas, or whose ideas you could listen to.


Alan Verdugo / 2017/04/19 / Uncategorized / 0 Comments

Massively modifying images with ImageMagick

    Web editors often need to edit a large number of images. For example, the large images produced by professional cameras tend to be overkill for most sites. I wanted a quick and easy way to resize large images, and that is how I found ImageMagick.

    ImageMagick is a suite of tools; according to the man page, we can “use it to convert between image formats as well as resize an image, blur, crop, despeckle, dither, draw on, flip, join, re-sample, and much more”. First, let’s install ImageMagick:
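On a Debian-based distribution, it comes straight from the standard repositories:

```shell
sudo apt-get install imagemagick
```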

    Then, we can use the convert command to do the actual editing. Check the man page to see the astounding number of options this command has. For example, if I want to resize all the JPG images in the current directory to a width of 1280 pixels and save each result under the same name prefixed with “min-”, I would execute the following command:
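A loop along these lines does the job (a bare width in -resize keeps the aspect ratio):

```shell
# Resize every JPG in the current directory to a width of 1280 pixels
# and save each result as "min-<original name>"
for image in *.jpg; do
    convert "$image" -resize 1280 "min-$image"
done
```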

    And here lies the advantage of ImageMagick: it can be used in a script to edit images extremely quickly. ImageMagick can also be used on Mac OS X and Windows. For more information about the convert command, refer to its man page.


Alan Verdugo / 2016/12/12 / Uncategorized / 0 Comments
