There is a large amount of emphasis put into how to search for and find a job as a software developer, site reliability engineer, or other tech position. However, finding and landing a job are a very small part of a person's technical career. Rarely do we spend the time thinking about or discussing how to manage the other aspects of a career in tech. And, by managing, I mean taking an active role in ensuring that you are looking out for yourself and not just the company. I am not saying that you should not do your best at work, but at the end of the day it is only a job.
This is the start of a series. I was going to make this a single long entry, but as I started working through the list of topics, it occurred to me that there is more than enough information to split up the posts over a series of weeks.
Before diving into the list of topics, let us take a moment and consider why you should take a proactive role in your tech career. Previous generations used to work at companies for 20 to 30 years. I am not saying that people don't still do this, but from my experience the days of staying at a company or in a single position for 20 years are largely over. This is especially true in the tech industry, where I have seen the average time in a position run 3 to 5 years. This is purely based on anecdotal evidence and my own experiences.
Beyond the tech aspect is the fact that companies are no longer loyal to their employees. I like my current job and my current employer. It is a good match. But, if they had to save money by reducing headcount and I was on the wrong team or in the wrong location, they would let me go and not blink twice. Oh, the managers might feel bad, but at a corporate level, I would just be a number. In the end, we all are.
I do not know about you, but school never taught me much, if anything, about managing my career. There were classes and on campus career services that would hold job fairs and help with writing and reviewing a resume, but that is about it. Just like personal finances, there are no lessons given to help navigate the next 20 or 30 years of your life.
Over the next few posts, I will be talking about a number of topics related to managing your career. These fall into one of three areas. The first is general information. This applies to everyone in the tech field, regardless of whether you are an individual contributor (IC) or a manager. The second area is for ICs, and covers the items and realities that they must consider. Last but not least is a section for managers. The manager section will shift a bit; part of the focus will be on managing others, but also on moving forward in the manager position.
I have put off publishing this post because I was not sure if I wanted to include the list of topics in it. Right now, I am going to leave it out, mainly because I want to leave myself room to change the topic order and what comes next. Instead of doing sections, I may skip around between IC, manager, and general discussion.
And, to be fair, I might not put forth thoughts that are correct. They are just my opinion. But, when I start writing, I am going to move forward with my views and go with what I have experienced. Your views may differ, and I understand that.
As a sneak peek, here is topic one: “You are responsible for your career, not your boss or your company.”
]]>The machine that I do most of my work on is my Lenovo Carbon X1 laptop running Pop!_OS. But, for reasons that I will not go into today, my main desktop runs Windows 11, and on that I run Windows Subsystem for Linux 2 (WSL2). WSL2 has a few Linux distros that you can run on it. For simplicity's sake, I have gone with Ubuntu.
Right now I am working with Jekyll to write and update a blog, and as such I want to be able to jump between machines when doing development. That means that they both need to have Ruby installed and be able to run the dev server.
As the distro is pretty much just a regular Ubuntu install, there should be no differences in getting this set up. The real issue is that I like to have my own references for the future, and this way I will know where to find them. No searching for a tutorial that may or may not work.
Enough of the idle chat. This is not a cooking blog with 16 pages about autumn days before getting to the meat of things.
To manage Ruby versions we are going to use rbenv. There are other Ruby version managers out there, so if you want to go with one of them, stop reading and find another tutorial.
Note: you do not have to install Homebrew. But, this is the recommended way of installing rbenv, and in the long run is easier to keep up to date.
To start with, update your system. It is just good practice. Then, install Homebrew.
$ sudo apt update
$ sudo apt upgrade
$ /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
After the install completes, it is recommended to install build-essentials and gcc. Here are the docs for Homebrew-on-Linux.
$ sudo apt-get install build-essential
$ brew install gcc
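One step that is easy to miss: on Linux, Homebrew is not added to your PATH automatically. The installer prints the exact commands for your system at the end of its output; the usual form looks like the sketch below, where /home/linuxbrew/.linuxbrew is the default Linux install prefix (verify the path against what the installer actually printed).

```shell
# Add Homebrew to the PATH for future shells, then load it into this one.
# /home/linuxbrew/.linuxbrew is the default install prefix on Linux.
echo 'eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"' >> ~/.profile
eval "$(/home/linuxbrew/.linuxbrew/bin/brew shellenv)"
```

After this, `brew` should resolve in new shells without any extra work.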
With Homebrew installed this part is super simple. The first step is to install rbenv and ruby-build. Ruby-build will build the ruby versions, and trust me, you want it installed. The rest of this is useless if you skip that step.
$ brew install rbenv ruby-build
...
==> Caveats
==> ruby-build
ruby-build installs a non-Homebrew OpenSSL for each Ruby version installed and these are never upgraded.
To link Rubies to Homebrew's OpenSSL 1.1 (which is upgraded) add the following
to your ~/.profile:
export RUBY_CONFIGURE_OPTS="--with-openssl-dir=$(brew --prefix openssl@1.1)"
Note: this may interfere with building old versions of Ruby (e.g <2.4) that use OpenSSL <1.1
$
The next step is to make rbenv part of the shell by default. For this to work we have to update the .profile of the account. You can do this by hand, or just add a single line that will take care of it every time a shell is launched.
I am going to update to use the Homebrew version of OpenSSL because I am using a more current version of Ruby. So, the next steps, when using the bash shell, are to add those commands to the profile file. Now, I will always go in and clean this up later, but this is the easy way to do it. Also, I will source .profile again so that you do not have to restart the shell.
$ echo 'eval "$(rbenv init - bash)"' >> ~/.profile
$ echo 'export RUBY_CONFIGURE_OPTS="--with-openssl-dir=$(brew --prefix openssl@1.1)"' >> ~/.profile
$ . ~/.profile
At this point, you have everything installed that you need. To ensure that rbenv and ruby-build are installed, you can run the following commands.
$ rbenv global
system
$ ruby-build --version
ruby-build 20220726
By installing rbenv via Homebrew, you should have a version of ruby installed. For me, this was version 3.1.2. Depending on when you read this, your version could be different. If the version installed works for you, skip the rest of this step and go to the next one on how to configure rbenv.
There are a few commands that are used to list installed versions, list available versions, and then install a Ruby version.
$ rbenv versions
3.1.2
$ rbenv install -l
2.6.10
2.7.6
3.0.4
3.1.2
jruby-9.3.6.0
mruby-3.1.0
picoruby-3.0.0
rbx-5.0
truffleruby-22.2.0
truffleruby+graalvm-22.2.0
Only latest stable releases for each Ruby implementation are shown.
Use 'rbenv install --list-all / -L' to show all local versions.
$ rbenv install 3.0.4
Downloading ruby-3.0.4.tar.gz...
-> https://cache.ruby-lang.org/pub/ruby/3.0/ruby-3.0.4.tar.gz
Installing ruby-3.0.4...
ruby-build: using readline from homebrew
Installed ruby-3.0.4 to /home/username/.rbenv/versions/3.0.4
This is the most vital part of using rbenv: being able to set up a global and a local version of Ruby. Global means that it is available to the system in general. Local means that it belongs to a subdirectory, and when operating there, that version will be used.
This allows for gems to be configured on a project by project basis.
$ rbenv versions
* 3.0.4 (set by /home/username/.rbenv/version)
3.1.2
$ rbenv global 3.1.2
$ rbenv global
3.1.2
$ ruby --version
ruby 3.1.2p20 (2022-04-12 revision 4491bb740a) [x86_64-linux]
$ rbenv local 3.0.4
$ ruby --version
ruby 3.0.4p208 (2022-04-12 revision 3fa771dded) [x86_64-linux]
$ cat .ruby-version
3.0.4
It is now as simple as that. When you move into that subdirectory, the version of Ruby will automatically switch to the version specified. This can be used in projects, and the .ruby-version file can be added to repos to ensure that the versions are the same across systems.
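Pinning a version for a repo is nothing magic; .ruby-version is just a one-line text file that you commit like any other. A quick self-contained sketch (the directory name here is only an example):

```shell
# .ruby-version is an ordinary one-line text file that rbenv reads
# automatically whenever you cd into the directory.
mkdir -p /tmp/myblog
cd /tmp/myblog
echo "3.0.4" > .ruby-version
cat .ruby-version    # prints: 3.0.4
```

Commit the file alongside the project, and every machine with rbenv and that Ruby version installed will pick it up automatically.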
Happy Coding
]]>My tech blog is once again live. And, I have to say that it took me longer than I thought it would to switch over from WordPress. I will get into why I moved later (that is a topic for another day), but it still amazes me how complex things are to get up and running and configured. For this post, that is where I am going to spend my time: talking about the complexities involved in setting up a simple website.
For the majority of companies and people, a hugely interactive website is massive overkill. Wait, let me think about it…. Nope, this is absolutely true. What used to be a server side rendered informational kiosk has now become a complex nightmare of JavaScript and client side rendering. This holds true mostly for business sites, but more and more individual sites are falling into the same trap, making something complex because you can, not because you should.
I remember that in the early days of the web, a large number of sites (I don't know how large) were nothing more than straight HTML. For a store that had 5 pages about them and their services, this was all they really needed. The extent of the automation they needed was a ‘contact us’ form that had to send an email. Many of these were built by hand, with Dreamweaver, or even with Microsoft FrontPage. There was no backend database, there was no automated configuration. It was simple. And, uploading was simple as well: plain FTP and you were done.
I will admit that some of the old websites were painful to look at. But, at the same time, anyone with a text editor and FTP could create a site and get it up and running. But, as with all things, times have changed. We have moved into an age where all the square edges have to be rounded, and visual looks take precedence over quality information or functionality.
At the same time, there are some authors that have bucked the trend. (Now that I have said that, I need to go find some of them.) And, using Jekyll, or another generated static site, is a step in that direction. It is simple from the viewing standpoint, but even getting a simple Jekyll website up is not for a layman. Instead, it takes knowledge of installing Ruby, configuration files, and HTML or templating.
Now do not get me wrong. I am not going back to writing this website in straight HTML with CSS. I have moved on from there. There is a reason one of the most widely used pieces of software on the internet is WordPress. For the majority of use cases, it just works. It makes it easy for someone who is not proficient at code to get a website up.
The downside to all this complexity is increased cost and reduced performance. I am curious what it is going to cost me to run this website. I am running it on S3 behind CloudFront. I could have used a 3rd party, but I wanted to see if I could do it, and I did. I do not have an automated build or infrastructure yet, but that will be coming soon. Still, I expect my bill to be less than $5 a month. And, it should load fast.
Now, with something like WordPress, you can get a starter price for a year at about $7 or so a month. And, depending on the company and the shared server you are on, the performance can be fine, or there can be up to a 3 or 4 second delay in load times. This is because the frontend has to hit a database for every single page load. With all those moving parts, there is a chance for any component to slow things down.
I don’t know what my original point this post was supposed to be. Just getting this new (old) site published was a pain, and getting analytics setup was less than optimal.
Overall, things have gotten better. But, maybe once in a while, we should just stop and ask, how can we keep it simple.
]]>At one point I had a blog. It was never a large blog, nor did I ever have a huge following, but I had a site up, and I would post periodic updates. It was focused on technical topics, and I would write about various subjects that interested me. However, maintaining a WordPress site was a bit much. It was not much work, but the plugins are a pain, and I was just never a fan of using all the tools for writing. It became more about making posts look pretty than focusing on the topic at hand.
That brings me to today. Here I am kicking off a new site using Jekyll. I am going to see about importing my old blog entries, but, that might be a while. The main goal is to get a new site up and running and to start writing again. That is the main point for today. You have to start somewhere. For me, that is getting this first post up, having a very bad layout, and getting something back out on the web.
What will this site be about? That remains to be seen. I am probably going to stick to a variety of technical topics. Now, the topics will be diverse, as they will be based upon my interests, and those interests can be all over the place. I think the first thing that I am going to dive into is getting this deployed and how to integrate it with automation and build tasks. I really enjoy automating the boring stuff, and doing it in such a way that is repeatable.
Hopefully, with more writing, my flow will return. As of right now, all of my writing feels very disconnected. I have noticed that this is what happens from taking time off from writing. Speaking is easy, but getting words down on the page, and having them make sense can be a challenge. Then again, some people say the way that I talk is confusing. Maybe I am just confusing and all over the place.
In addition to writing about projects I am working on, I may dive into areas that I have a deep interest in. One of those areas is System Engineering and Architecture. Another area is software design. And, looking at these two topics, I am sure that there are novels that could be written about both. I am here to say that I doubt I could write a novel about either topic. Mainly because I think I would get bored before finishing it, and it would probably be out of date before it was done.
There is probably a high likelihood that this post will get published en masse with the ones about getting my site posted and online. That is because I want to be able to push this easily without having to think about it. So, that means this could get posted in the past and nobody will ever read it. That is ok. Part of writing this is for myself, and part of it is for others.
Currently, the site kinda looks like junk. There are no sections, and the layout is a bit meh. But, I would rather start getting content up than spend 6 months trying to make it look “pretty.” Don't get me wrong, I would love for it to look great and be functional, but I will take functional and ugly over non-existent any day of the week.
The last note is that I used to have a blog, and I am looking at pulling in the content. That should be interesting as I did have some images and such on the pages. As it is, I am going to have to figure out how to add media here as well.
With that, I am going to sign off, and start looking at how to get this up and running online, and behind SSL.
]]>And, I blame this on the search engines. These sites are the reason that they are still in business. They advertise with them, and use their understanding of the algorithms to show up at the top of searches. My question is, how do we get away from this? How do we get back to an internet of the people, by the people, and not an internet solely made up of corporate interests? To be honest, I do not really have an answer to this question.
So, if we cannot trust search engines anymore to provide us with links to meaningful, “real” content, who do we then turn to? Do we use search engines that skip the first 5000 results? Do we try to use more esoteric terms that mainstream companies have not thought to wrap with an enormous number of search results? I don't know. That seems like a losing battle, as even those results can be gamed for someone's bottom line.
Heck, at this point, we have come to a time when popular blogs are blatantly copied and then redistributed by companies looking to make a quick buck. Companies that run search engines do not care. They just want to present the data and make their dime off of the user that enters a value into the search line. Think about it: if search is free, how do they pay for it? It is by gaining knowledge of what we search for, and figuring out how to convert that into cash.
To this extent, I am tired of the game. I will be upfront and say that my site uses both Google AdSense and Google Analytics. Over the course of 20 years, I have not made $100 on this site. (Wow, that is kinda sad.) But, I am going to move back to an old mechanism that was available on the web years ago: a links page. And, I am going to link to blogs and sites that I, if not trust, at least respect. I am hoping that others will start to do the same. And, instead of searching the World Wide Web for new content, maybe you will look at the sites that I have linked to and see what they have to say.
I will not link to large corporations. That is not to say I will not create a link page for those purposes, but, I am going to make a page dedicated to people and creators that I find interesting.
Please help me in this. I want to find other people that are trying to have a voice.
]]>The following is a walk-through of how to enable Single Sign-On in the AWS console. These instructions are aimed at people with anywhere from an individual account to a few accounts. Actually, these instructions are just a good general starting point for smaller-sized orgs.
I like to add the Group first because then I can just add the Users to all the groups when I create them.
Setting up and configuring AWS Single Sign-On is almost complete. The last remaining steps are to create Permission Sets and then to assign them to users or groups that are bound to accounts. As a default, I like to create an Admin and a ReadOnly permission set. You can attach the users/groups to the accounts first, but I prefer to create the permission sets first, and then attach users/groups to the accounts.
All you have to do now is go to the link that is provided in the configuration. Once you log in, you will get a list of the accounts that you have available to you and the roles that you can assume in each one.
That is really all there is to it. It is quick and easy to set up, and after you start using it, you will wonder why you did not do it sooner. This is a great solution for solo developers, small companies, and even larger companies if you want to get into integration with your own federation servers.
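As a side note, the same setup also works from the AWS CLI (v2), which supports SSO profiles. A profile along these lines in ~/.aws/config lets you run `aws sso login --profile dev-admin` and assume the role from the terminal. Every value below is a placeholder: swap in your own start URL, region, account ID, and permission set name.

```ini
# Hypothetical ~/.aws/config entry for an SSO-backed profile.
[profile dev-admin]
sso_start_url  = https://example.awsapps.com/start
sso_region     = us-east-1
sso_account_id = 123456789012
sso_role_name  = Admin
region         = us-east-1
```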
]]>So, I decided that I was going to install a new Linux distro on my system and get back to writing and doing things. Out of curiosity I dropped in to DistroWatch to see what the flavor of the month currently was. To my surprise I saw MX Linux at the top. After the issues I have had with it, that was interesting to see. But, sitting at spot number 5 was Pop!_OS. Huh? I had heard of it, but thought that it was some simplified version for kids. I was wrong.
It turns out that Pop!_OS is an operating system based on Ubuntu by the folks over at System76. System76 makes Linux computers, laptops, and servers, and at one point shipped Ubuntu. But, due to some sort of falling out, they made their own distro based on Ubuntu. And, that is how we got Pop!_OS.
Enough of the history lesson.
How do I think it stacks up? So far, I am liking it. The desktop is running Gnome3. It has been a year or 4 since I used Gnome as my main desktop windowing system. For a while, I have been using XFCE, Mate, or Cinnamon. And, while all of those are perfectly good GUIs, I like the way that Pop!_OS has set it up. Either that, or I just like Gnome 3. It is a completely different experience than you get with other systems.
How does the system work on my laptop? There was one issue that I had to fix right away. On my laptop, the screen brightness was cycling through the various brightness settings. This was due to the battery-saving and automatic brightness controls. That can be changed in the settings panel, and after I did that, life got immensely better. (Having your screen change brightness constantly will make you go insane. Trust me on this one.) The next thing that I did was change the touchpad to not click on tap. While it was not overly sensitive, I have heavy hands, and always turn this off.
The next part comes down to typing. I like to write this blog and to do some coding on the side. With MX Linux I had to disable the touchpad for 1 second after typing so that stray contact would not cause the cursor to jump to wherever the pointer was. This is a serious distraction when attempting to write code or to write just about anything. My experience on Pop!_OS has been great. I have been typing my thoughts about Pop!_OS for the last little bit, and have not had any issues. That is a huge plus.
But, what about power settings, and what happens when you close the lid on your laptop? All of that worked out of the box. With MX Linux, I fought with it continually. With Pop!_OS I did not have to make a single change. By default I believe it suspends when you close the lid. It has not frozen at all on wake up, and the experience has been great.
Pop!_OS comes with the Pop!_Shop. This “shop” has a very large set of applications that are available for installation. Below is an image of what it looks like when you launch it.
From the Pop!_Shop it was easy to get other applications installed. I was quickly able to install Spotify, and it has links for Chromium (open source version of Chrome), Atom, Steam, and a slew of others. Also, it automatically checks for updates, and prompts you to install them.
It was a nice change of pace not having to add additional repositories in order to install some common applications. I use Visual Studio Code for dev work on Linux, and even it was there and easy to install, just make sure to use the .deb version.
If you do not want to use the Pop!_Shop, you can always fall back to the command line. That is my default for much of the way that I run my system, and since Pop!_OS is based on Ubuntu, which is based on Debian, apt and aptitude still work. Note, you will need to install aptitude if you want to be able to use it.
In case you are interested, below is a short list of preinstalled software:
For me the biggest switch was moving back to gnome and enabling keyboard shortcuts, or finding out what the keyboard shortcuts are. This has more to do with me wanting as many shortcuts as possible. Others don’t mind clicking the mouse to switch screens, but that is not the way that I like to work. So, I will share the settings that I use.
In Gnome 3 there are virtual workspaces that are located up and down from the main display. You can hit the Windows/Super key and it will display them on the right. I prefer to quickly jump between workspaces using the keyboard, and in order to do that you need to do the following.
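Roughly equivalent settings can also be applied from the command line. These are the standard GNOME window-manager keybinding keys; the specific Super+number combinations below are my own preference, not a requirement, so adjust to taste:

```shell
# Bind Super+1..4 to jump directly to a workspace (Gnome 3 wm keybindings).
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-1 "['<Super>1']"
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-2 "['<Super>2']"
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-3 "['<Super>3']"
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-4 "['<Super>4']"
# And Super+Shift+number to carry the focused window along.
gsettings set org.gnome.desktop.wm.keybindings move-to-workspace-1 "['<Super><Shift>1']"
```

The same keys can be set through Settings → Keyboard → Shortcuts if you prefer the GUI.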
Doing this gets the system to the point that I can use it without worry. Well, without wanting to throw it out the window. By default alt+tab will switch between windows on all workspaces. This is a benefit as, I have had to fight with other systems to get that functionality to work. Don’t get me started about windows.
After using Pop!_OS for just a short period of time, I think I am going to stick with it. The system has been easy to use and configure. It just gets out of the way so that I can get my work done. To be honest, I wish that I would have found it sooner. For me, it just works. Yes, I am a power user, but that is all good.
Even running Gnome3 on a 3 year old machine with 8 gigs of ram is fine. I will admit that I would be hard pressed to run any virtual machines on this, but for being able to build and run apps it is fine.
Another item that I like is that I did not have to fiddle with different system settings and repos to get base functionality. I have spent hours trying to get Fedora configured properly, and that is just a waste of time for me at this point.
If you are interested, I think you should give it a try.
]]>Now, I am not saying that we should return to the days of old, but it brings to mind that learning some of the core fundamentals is not something you get just by trying to get a system up. You have to go through different programs, and make the decision to move forward with learning the command line, and system fundamentals. And, with all the coding boot camps and quick starts, on occasion I am amazed anyone can find a good point to start.
That being said, I have been asked by some people that I know to help them gain a better understanding of the fundamentals of programming. There was a direct request to learn using Python.
Python is a flexible language that you can do most anything in. And by anything, I mean almost anything. I started using it because it was a language that was available on Solaris that a co-worker did not want to use, and we at first did not want him on the project. My understanding, and knowledge of Python has changed a lot in the 8+ years since we decided to use Python for an internal company project.
That being said, it is a multi-disciplinary language. It is used for web development, system administration, machine learning, scientific studies…. In addition to this, other than the strange white spacing it uses to know what is going on, it provides a decent starting point to begin working with other languages. I have to admit, if you want to learn the true ins and outs of programming at a system level, Python will not get you there. For that you will need to dust off some books on C/C++. Although, I have heard that Rust is starting to replace some of that. Out of scope.
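For anyone who has not seen it, that whitespace quirk is worth a quick illustration: Python uses indentation, not braces, to decide which lines belong to a block.

```python
# Indentation defines the block: the indented line belongs to the `if`.
def classify(n):
    if n % 2 == 0:
        return "even"  # runs only when the condition is true
    return "odd"       # dedented, so it sits outside the if-block

print(classify(4))  # even
print(classify(7))  # odd
```

Get the indentation wrong and the program either fails to parse or silently means something different, which is exactly why newcomers find it strange at first.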
Back to Python and getting started.
I wanted to find a resource that would cover a broad range of topics when it came to Python, and also provide real-world examples. When I was in college we spent a year going through a book on C++. That was difficult for the students, and many of them were lost along the way. And, after that year, I could not really write anything that would produce something that I could show to anybody. If it were not for the money I was forking out for college, I might have asked what the point was.
After some quick searches, I chose a book to use. I decided to go with Python Crash Course, 2nd Edition: A Hands-On, Project-Based Introduction to Programming. (FYI, I don't get any money if you use that link.) The reason I chose this book is that it has you actually build something that you can see results from.
Over the course of the book, you will work on 3 projects. I am sure they are not the most advanced projects, but they provide a foundation from which you can then launch your own projects from.
For a lot of developers, that is some great basics to cover. True, it does not cover writing a serverless application on AWS with Lambda and API Gateway, but, it goes about teaching a person how to think about an idea and implement it.
Note: This is all speculation. I am working through the book now with 2 people, but wanted to track my experience with it as I went through. But, I think one of the key factors is working through the entire book, and not skipping. The reason I say this is because I know a number of self taught developers that while excellent at writing code, do not know how to communicate their ideas with others.
Hopefully, by working through a book like this, it will teach enough of the fundamentals and language so that both of the people I am working with will be able to advance their careers.
]]>Because of these limitations, I have built closed and open source solutions to manage the complexities of working with CloudFormation. Recently I even started revamping a tool that I wrote years ago to manage complex CloudFormation stacks. This update was done at the behest of a few people that actually use the tool and wanted it taken from the messy state it is currently in into something that could be tied into their current applications. It was from this that I began working on CfnMason as a Python module.
However, recently I started working with CDK, Amazon's Cloud Development Kit. My first thought was that it was going to be horrible, and why would anyone ever use it. Now, this was before it was a fully supported implementation, back when it was only really viable with JavaScript. And, don't get me wrong, I don't hate JS, but I do most of my coding in Python these days. So, when I finally had a chance to use it for work, I found that I really liked it, and that it was actually an excellent tool.
So, that brings me back to the original question. How do I know when it is time to stop working on a project? The main answer in my mind is you have to figure out that for yourself. When I started writing this, I was pretty sure that I was going to say that I am no longer going to be working on updating CfnMason. But, as I wrote this, I realized that not everyone is going to be able to move over to CDK. There are probably thousands of CFN stacks that have been created over the years, that require updates and tweaks, that are not a good fit to move over to CDK. As of yet, I don’t know of a way to take a template that is in AWS and convert it into a working CDK script so that you can develop on it from there.
This is why writing is sometimes the best way to find an answer to a problem, even technical ones. At face value, there are a number of projects that seem like they should just be discarded and never used again. But, once you analyze the situation properly, you might realize that there is a reason to move forward with development of a seemingly dead solution. It might even be to use it as a growth platform. Or, it could be that while there are new tools available, that for some, older and simpler tools are also still needed.
So, at the end of the day, what started as a note to say that I am no longer going to be working on CfnMason, has been turned around to me stating that I am going to try and get it done. Ha, yeah, even I laughed at that. Although now that the nation is in lock down, there is more of a chance that I might get it finished.
]]>I recently switched over to Poetry as a package manager for my project CfnMason. (CfnMason is a tool related to CloudFormation stack management, but you can read about that in the Readme as it is updated.) The question is, how do you use Poetry when you are working on a project across multiple machines and operating systems? I am going to attempt to address that issue.
So, you want to join a project, or work on a project that is using the Poetry dependency management tool. Great! But, how do you get the requirements setup for the project so that you can start working on it? How do you know which version of Python to use, which packages to install, how to build the project, or how to run the test suite?
This is an issue that I was facing, not with someone else's project but with my own, as I was switching between machines. Now, I do know some of the answers to the questions above, but I was still stuck as to how to set up the project on another machine. As such, I decided to walk through the process of coming onto a new project and determining how to work with it. The project used for this walk-through will be CfnMason, a tool that manages some aspects of building and deploying CloudFormation stacks on AWS.
Hopefully the project that you are working on has a Readme file. Though, to be fair, documentation is hard, and is often the last thing added to a project. If setup instructions are there, you should be able to follow them; if they are not provided, then the following steps are the way that I would go about working on a project that uses Poetry for dependency management. Oh, and as a note: I am making the assumption that you already have Poetry installed.
If the Readme does not tell you that the project is using Poetry, then there is a quick way to find out: look for a pyproject.toml file in the base of the project, and check whether it contains a [tool.poetry] section. Provided you are able to find this file and that line, the project is using Poetry.
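The check is a one-liner with grep. To keep the example self-contained, this sketch fakes up a minimal pyproject.toml in /tmp first; on a real project you would just run the grep in the project root:

```shell
# Create a throwaway pyproject.toml purely to demonstrate the check.
mkdir -p /tmp/poetry-demo && cd /tmp/poetry-demo
printf '[tool.poetry]\nname = "cfnmason"\n' > pyproject.toml

# If this prints a match, the project is managed by Poetry.
grep -n '^\[tool.poetry\]' pyproject.toml
```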
One nice thing about Poetry is that it has a defined location that identifies the version of Python. I am a big fan of this, as the differences between versions can cause major problems. Take for example that reserved keywords changed a bunch between 3.6 and 3.7. To find the supported versions, open pyproject.toml, search for the python entry, and you will find something like:
python = "^3.6"
I am a huge proponent of using a virtual environment for each application that I am working on. In some cases, I will have two: one that includes all the development modules, and one with just the modules needed for the application to run. Since Python 2 is pretty much EOL, I am not going to spend any time on how to set up a virtual environment for it. Instead, this is all dedicated to Python 3, and I can only guarantee it on Python 3.6 or later.
foo$ python -m venv venv-dev
foo$ python -m venv venv
foo$ ls -ld venv*
drwxr-xr-x 1 foobar 197610 0 Nov 24 17:02 venv/
drwxr-xr-x 1 foobar 197610 0 Dec 25 14:19 venv-dev/
For this last part, you need to activate either of the Python Virtual Environments and then run the install code from there. This is only if you really want to install it both ways. If not, then you can just create a single virtual env directory and just install all the dependencies.
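Concretely, activating an environment and handing off to Poetry might look like the sketch below. The paths are illustrative; in my experience Poetry will install into an already-activated virtual environment rather than creating its own, though that behavior depends on your Poetry configuration.

```shell
# Create and activate the dev virtual environment (bash/zsh syntax).
python3 -m venv /tmp/venv-dev
source /tmp/venv-dev/bin/activate

echo "$VIRTUAL_ENV"   # confirms which environment is active
# poetry install      # would now install into this environment
```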
Install all the dependencies, even the ones needed for development.
foo$ poetry install
Installing dependencies from lock file
Package operations: 10 installs, 0 updates, 0 removals
- Installing more-itertools (7.2.0)
- Installing zipp (0.6.0)
.....
- Installing cfnmason (0.1.0)
The other option is to just install the libraries needed to execute and run the module. I would almost prefer if it defaulted to the method below, but it works.
foo$ poetry install --no-dev
Installing dependencies from lock file
Nothing to install or update
- Installing cfnmason (0.1.0)
That is it. You should be up and running. At least to the point where you can get started with the project. Moving forward from this point will rely a lot upon how the project is setup, and how well it is documented. But, the big factor is that you can now start working on it while using Poetry, or you have the foundation to work on a project across multiple machines.
]]>