CDNs, Postgres, JSON, datetime, PS tail, WebSites, Fish Shell, User Identification, Scrum / Agile / CI
Post date: Apr 15, 2018 4:09:26 PM
- Are CDN businesses being consolidated? Highwinds -> StackPath, NetDNA -> MaxCDN, MaxCDN -> StackPath. It sounds like there are too many CDN operators, and they're being consolidated, as usually happens when a business sector starts to mature. Yet operating smaller CDNs on cost-effective hosting can be a market segment that doesn't interest the larger players with their higher price points and service levels.
- Why Use Postgres - Nothing to add. Postgres is awesome; that's why I love PostgreSQL. JSONB is awesome, because manual complex JSON -> SQL mapping is often simply painful. GIN and GiST indexes are used for full-text and geospatial indexing. Also check out PostGIS and OpenGIS. Upsert is also nice, yet for simple use cases I can't stop loving SQLite3's REPLACE, which does an update / insert of the whole row based on the primary key. It's just incredibly handy: you read / create data and then just store it, without caring whether it's already in the database or not. I like that workflow because for an update you often need to read the data and verify some things anyway. After that logic, storing the updated data back should be as simple as possible.
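A minimal sketch of that REPLACE workflow, runnable with Python's built-in sqlite3 module (the `kv` table and its columns are made-up examples, not from any real project):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (key TEXT PRIMARY KEY, value TEXT)")

# REPLACE INTO is SQLite shorthand for INSERT OR REPLACE: if a row with
# the same primary key already exists, it is replaced by the new row.
conn.execute("REPLACE INTO kv (key, value) VALUES (?, ?)", ("a", "first"))
conn.execute("REPLACE INTO kv (key, value) VALUES (?, ?)", ("a", "second"))

row = conn.execute("SELECT value FROM kv WHERE key = ?", ("a",)).fetchone()
print(row[0])  # second

# The PostgreSQL upsert equivalent (since 9.5) would be roughly:
#   INSERT INTO kv (key, value) VALUES ('a', 'second')
#   ON CONFLICT (key) DO UPDATE SET value = EXCLUDED.value;
```

Note that SQLite's REPLACE deletes and re-inserts the row, while Postgres's ON CONFLICT DO UPDATE updates it in place, which matters if other columns or foreign keys reference the row.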
- So much tuning with different kinds of JSON APIs and working between Python and Go. Well, in the end it worked out as expected. Good project.
- Sigh, datetime with Python is such a mess, especially with timezones. time.time() is nice, as is timedelta, but datetime's timezone handling is so messed up. I'll manage, though. This is one of the reasons I always prefer UTC timestamps; it's the only sane way to go. It also seems there are annoying variations of the ISO timestamp: some separate the timezone offset's hours and minutes with a colon and others don't. Sigh. Also, the standard strptime function doesn't accept 'Z' as a valid timezone / UTC offset.
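One way to cope with those variations is to normalize the timestamp before handing it to strptime; a small sketch (the helper name and regexes are my own, not stdlib):

```python
import re
from datetime import datetime, timedelta

def parse_iso_utcish(ts: str) -> datetime:
    """Normalize common ISO 8601 variants so strptime's %z accepts them:
    - trailing 'Z' -> '+0000' (older Pythons reject 'Z' in %z)
    - '+02:00' style offsets -> '+0200' (colon inside the offset)
    """
    ts = re.sub(r"Z$", "+0000", ts)
    ts = re.sub(r"([+-]\d{2}):(\d{2})$", r"\1\2", ts)
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S%z")

dt = parse_iso_utcish("2018-04-15T16:09:26Z")
print(dt.utcoffset() == timedelta(0))  # True
```

All three spellings of the same instant ('Z', '+00:00', '+0000') then parse to equal aware datetimes.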
- Tail -f using PowerShell: Get-Content -Wait -Tail 50 -Path logfile.etc
- That will nicely list the last 50 lines plus any new lines written to the log. Very useful when monitoring logs in near real time.
- Cyber weapons being leaked. Who's trolling who. Great question and a good blog post by Schneier.
- I just seriously hate websites that make strange assumptions, like the DB X-Trackers ETF site. They assume that if you're an English client, you'd like to use pounds, or that if you want to buy an ETF on XETRA, you'd prefer German. How annoying is that? What if I want my stuff in English and I want to buy from XETRA in euros?
- Switched my default shell from bash to fish shell. In the good old days I used zsh, but that's decades ago. Also upgraded my default Python version to 3.6, which is nice. It seems that fish borrows quite a lot from Python; no surprise there. I've often used a Python script to generate shell or SQL scripts. It's always good to use the tools you know very well.
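As an illustration of that generate-scripts-from-Python habit (the file names and the gzip command here are hypothetical, just to show the shape):

```python
# Build a small shell script from Python instead of wrestling with shell
# loops: one command line per input file, joined into a single script.
files = ["app.log", "db.log", "access.log"]  # made-up file names

lines = ["#!/bin/sh", "set -e"]  # fail fast if any command errors
lines += [f"gzip -k {name}" for name in files]
script = "\n".join(lines) + "\n"

print(script)
```

The same pattern works for SQL: format the statements as strings, write them to a file, and feed that file to psql or sqlite3.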
- One of the reasons many sites aren't using strong official identity authentication is its cost. First you have to set up secure infrastructure and build the integrations, and even then you pay running costs for the system. So far in Finland the authentication cost has been 0.50€ / authentication; now it has been capped at 0.10€ / authentication. That's still quite high for many use cases. I know that even some major players have offered official authentication only for a while, during customer acquisition, and then reverted to a classic user/pass login because it's just so much cheaper! I would love to get 50 cents on every Facebook, Google, or Twitter login. I'm also curious whether new service providers will enter the nationally trusted, legally binding official authentication market in Finland. If you think this is silly, I've heard there are still countries that use methods like utility bills for identification, and other utterly ridiculous methods. - It was in the news recently that the government is looking to reduce online user identification costs, because they run to several millions per year.
- There is now news that Scrum is too slow. That's just one of the reasons I haven't used it in projects for a long time, and why continuous integration (CI) is used instead. We've received a lot of feedback from happy customers thanks to CI practices. Why? In some companies getting things done takes six months; in others it takes around two to four weeks due to Scrum latency. But when things get serious and speed is key, CI is there to save you. In the last integration project it usually took just a few hours to push changes into production once the customer had clearly confirmed what they wanted. Why should software changes take weeks or months, especially when the changes aren't that big? As I've written so many times, it's clear that whenever something is created, multiple iterations are required. This project isn't yet complete, but we've already logged 173 versions of the integration software deployed in production. What if the delivery latency had been longer, due to some archaic release cycle, or even the two weeks of a Scrum sprint? In some cases there's an hour-long Skype for Business meeting; when something is agreed in the meeting, the requested changes have already been deployed during the meeting, and a second or even third iteration is possible within that hour. No problem! And trust me, the customers do value this.