Hurl, until now, was my favorite tool for API testing, but there is a new 'toy' on the market: httpx (https://httpx.sh/). If you are already familiar with the JetBrains HTTP Client syntax, you can hop right in and write tests in a familiar syntax. httpx is a total badass when it comes to writing 'multi-model' tests, like when you need to do something with a REST API, then reuse that data in Pub/Sub, or even run something over SSH.
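For readers unfamiliar with the format, here is a small sketch in JetBrains HTTP Client syntax. The endpoint is hypothetical, and whether httpx supports response-handler scripts exactly like this is an assumption based on its claimed compatibility:

```http
### Create a user and capture the id for later steps
POST https://api.example.com/users
Content-Type: application/json

{ "name": "Alice" }

> {%
    client.global.set("userId", response.body.id);
%}

### Reuse the captured id in a follow-up request
GET https://api.example.com/users/{{userId}}
```

The `> {% ... %}` block is the response-handler script that makes data reuse across steps possible.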
Apollo Server and Apollo Client are great and immediately added value to our stack and implementation of GraphQL.
Steer clear of Apollo Enterprise unless your data usage qualifies you for the "team" plan. Enterprise is priced orders of magnitude higher and might not offer your team any value (unless you're supporting hundreds of users and dozens of teams).
Experience with the sales team was not great: they are not really interested in supporting mid-tier teams, and they can cut access to your enterprise account if you don't upgrade. I recommend leveraging an alternative APM solution unless your org needs enterprise support.
I've been a developer for over 9 years, and for most of that time I've been working with C#; it's been paying my bills to this day. But I'm looking to learn other languages and expand my possibilities for the years ahead.
Now the question... I know Ruby is far from dead, but is it still worth investing time in learning it? Or would it be better to take Python, Golang, or even Rust? Or maybe another language.
Thanks in advance.
I feel most productive using Go. It has all the features I need and doesn't throw roadblocks in your way as you learn. Rust is the most difficult to learn, as borrow checking and other features can puzzle a newcomer for days. Python is a logical next step, as it has a huge following and many great libraries, and one can find a gig using Python in a heartbeat. Ruby isn't awful; it's just not as popular as the others.
Another reason to use Python is that it is not compiled. You can muck around in the interpreter until you figure things out. OTOH, that makes it less performant. You really need to think about your use cases, your interest in lower-level versus high-level coding, and so on.
Hi Caue, I don't think any language is dead in 2022, and we still see a lot of Cobol and Fortran out there, so Ruby is not going to die for sure. However, based on the market, you'll be better off learning Go and Python. For example, for data science, machine learning, and similar areas, Python is the default language, while for backend APIs, services, and other general-purpose work, Go is becoming the preferred choice.
I hope this helps.
Apart from ingesting cloud data activity into the SIEM via Google Cloud connectors, how is Exabeam using Google Cloud Platform? Is the Exabeam Fusion SIEM or Exabeam Fusion XDR SaaS offering rearchitected in a way to be offered on Google Cloud Marketplace? If so, what architectural changes were made for the SaaS offering on GCP? (e.g., is it using Google Cloud-specific technologies for analytics?)
For posterity: the next phase of the partnership aims to use Google Cloud analytics products (BigQuery, Cloud Spanner, Dataflow, Looker, Pub/Sub) to create a next-generation hyperscale cloud-native SIEM and cybersecurity analytics offering. It seems as though the partnership has gone through two phases so far. Phase one sounds like Pub/Sub, Workspace, and BigQuery connectors for the existing Exabeam Fusion SIEM and Fusion XDR, to simplify ingestion of cloud activity into Exabeam and extend security analyses. Links: Phase 2: https://www.exabeam.com/newsroom/exabeam-partners-with-google-cloud-to-create-hyperscale-cloud-native-siem-and-cybersecurity-analytics-offerings/ Phase 1: https://www.exabeam.com/siem/fusion-xdr-and-fusion-siem-now-available-in-google-cloud-marketplace/
I wanted a badge on GitHub so visitors to the repo would know immediately that they were dealing with something tested. Racket's rackunit testing library had a pre-existing integration with Coveralls, and it was quick and easy to get going.
I don't use many of the fancier parts of Coveralls, but it does exactly what it says on the tin: tracks your coverage and reports it faithfully over time.
Sweden is one of the most developed countries in Europe, with high life expectancy and low crime. It is also considered one of the most gender-equal countries in the world, thanks to its policy of "positive discrimination", which aims to promote women's rights and equality. Sweden also has a very high literacy rate: 99% among adults aged 20-24 and 98% among those aged 25-34. The Swedish language is spoken by around 9 million people worldwide, mostly in Sweden and Finland, but also in small parts of Norway, Denmark, Canada, Russia, and Kazakhstan.

Sweden is known for its natural beauty, which can be seen on its map. Sweden is a country in northern Europe that borders Norway and Finland. The capital is Stockholm. It is a constitutional monarchy with a parliamentary democracy; the current king is Carl XVI Gustaf. Sweden has been at the forefront of democratic change in Europe since the 18th century. The map of Sweden shows the many places worth visiting in the country, including the Stockholm archipelago, Lapland, and Gotland.

Sweden has a population of just over 9.5 million and is located in the northern part of Europe. It is known for its high standard of living, freedom of speech, and democracy. The Swedish government has explored various ways of using technology to improve the country's way of life; one such way is using AI and robotics to map Sweden for tourists and visitors who need help navigating its vast landscapes. Sweden has an advanced economy and has attracted technology giants such as Spotify, Skype, Ericsson, and Volvo Cars.
I am using Django with the Django REST Framework for the backend and API, and React for the front-end.
- What is the best way to deploy this? Should the backend and front-end be on different domains, or both on the same domain?
- My app will be used concurrently by more than 1,000 users. How can I achieve good performance? Any suggestions?
The best advice would be to put an nginx reverse proxy in front, routing certain calls to the frontend and others to the backend. The best way to build is Docker images plus a docker-compose file for starters, in which you define the database, the backend, the frontend, and the nginx reverse proxy, which you can configure to use certificates via certbot from Let's Encrypt. You can worry about performance later, at which point you will probably need to migrate towards Kubernetes; a standard docker-compose setup should serve you well for up to 1,000 users. You can initially use the same domain, with e.g. /v1 routes going to the backend and everything else to the frontend, which should be a plain nginx image serving the compiled frontend files. You can compile them using multi-stage Docker builds, or just plain CI/CD such as GitLab.
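A hedged sketch of the routing described above; the domain is a placeholder, and `backend`/`frontend` are assumed to be docker-compose service names on the same network:

```nginx
# Route /v1/* to the Django backend, everything else to the compiled React frontend.
server {
    listen 80;
    server_name example.com;  # placeholder domain

    location /v1/ {
        proxy_pass http://backend:8000;  # hypothetical docker-compose service name
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }

    location / {
        proxy_pass http://frontend:80;  # nginx image serving the built frontend files
    }
}
```

TLS termination (certbot/Let's Encrypt) would be layered on top of this `server` block once the domain is live.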
In a global sense, if there is such an opportunity, and if it is applicable to your project, I really advise you to change the technical stack. Try using compiled UI frameworks (Svelte) and fast server-oriented languages (Java, C#).
But if we take the current conditions, then your best option is to audit all of your technologies. Review your data consumption: how fast your database responds, and how well you exploit the strengths of both languages, e.g. in optimization. Verify the quality of your server provider, database, and other technologies. If you use images extensively, consider a CDN.
As for placing the API and the website on the same domain, the choice is yours, but in my experience most people prefer an api subdomain (e.g. api.example.com); it's easier and clearer.
These are big and complex decisions, but they will affect UX, and therefore your income :)
Useful resources: https://reactjs.org/docs/optimizing-performance.html https://docs.djangoproject.com/en/4.0/topics/performance/ https://www.sisense.com/blog/8-ways-fine-tune-sql-queries-production-databases/
A senior developer from our Shoplifters team says the following about what the Elixir Squad is currently working on:
“The Elixir Squad is currently working on implementing new features for our e-commerce platform, such as a compatibility search for smart-home devices from different manufacturers. Not all of the products listed can actually be ordered via the connected store systems; rather, they serve as a general planning aid for the end customer when selecting products. However, finding and identifying planning errors and bugs, and working out solutions for them, is also a constant item within the team.”
I asked the same Shoplifters member what the greatest potential of Elixir and Phoenix is:
“Elixir and Phoenix provide us with powerful tools to investigate the topics mentioned. Despite the rather dynamic nature of the language at first glance, the ecosystem provides us with powerful tools for static analysis to identify even the not-quite-obvious module dependencies, even in a complex system like our platform backends, and to develop strategies for solving problems based on this. At least, that's where the potential lies from a developer's point of view. The economist will like the fact that Elixir allows a relatively compact programming style, which can largely do without unrecoverable error handling and can nevertheless be fail-safe and fault-tolerant, provided one sticks to the basic rules of "process supervision".”
Finally, I asked him what advice he had in store for Elixir newbies:
“Elixir is not an object-oriented language; even though it has adopted many syntactic elements from Ruby, the semantics behind them are quite different. Please take this to heart, and don't try to find a substitute for objects and classes in processes and 'actors'.”
Azure AD and SSO were seamlessly integrated and allowed for easy integration with all my other tools. By adopting this out of the gate, I also did not need any AD servers, which is beautiful; who wants to maintain servers anymore? Also, by purchasing this one plan, I was able to remove the need for any SSO vendors like Okta or Ping. The authenticator app also made MFA easy, simple, and safe, and we're fast approaching passwordless login. Couple that with all the MDM and AV capabilities, and it's hard to choose another solution, especially if you're greenfield.
cruft allows you to maintain all the necessary boilerplate for packaging and building projects separately from the code you intentionally write. It is fully compatible with existing Cookiecutter templates. Using cruft, you can make sure your code stays in sync with the template it came from.
I recommend using cruft with all of my cookiecutters.
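The typical workflow, sketched as hedged shell commands (the template URL is a placeholder):

```shell
# Generate a project from a Cookiecutter template; cruft records which
# template revision it came from ("<org>/<template>" is a placeholder).
cruft create https://github.com/<org>/<template>

# Later, check whether the upstream template has changed since generation.
cruft check

# Apply upstream template changes to your project as a reviewable diff.
cruft update
```

`cruft check` fits nicely into CI, so a drifted template fails the build instead of going unnoticed.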
We needed a sophisticated help center solution as we provide extensive support with 3 support team tiers. Intercom is more about messaging and automation. We needed sub-tickets, tasks, reminders, SLAs, automation, and strong API connections. That's why we have chosen Zendesk and we are quite happy with that.
Hello dear developers, I am a newbie in full-stack development, and I have been assigned a full-stack application that manages text and user data. I have started learning Node.js and know some basics of it. The project was assigned to me by my university professor. My team and I have developed a good front end for our website in React (we are familiar with JS). Which backend should we use?
Instead of using Node.js: since you are familiar with SPAs, I'd try to do as much logic as possible in the web app. Using the Firebase browser library, you can authenticate users with a variety of OAuth/password methods, store user data, and use DB authentication rules to grant read and/or write permission according to various rules. With the right combination of Firebase tech, you can likely avoid any Node.js at all... (unless it is a class requirement to have some Node.js, in which case maybe add a serverless cloud function that does some admin action like cleaning up old data).
There are two Firebase DBs: Firestore, and Realtime Database. Start with Firestore because it is the recommended technology.
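For illustration, a minimal Firestore security-rules sketch in the spirit of the advice above: it lets each authenticated user read and write only their own document. The collection name is hypothetical:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Each signed-in user may only touch the document keyed by their own uid.
    match /users/{userId} {
      allow read, write: if request.auth != null && request.auth.uid == userId;
    }
  }
}
```

Rules like this are what let you skip a backend: the database itself enforces who can read and write what.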
I would definitely go with Strapi as a backend for a project like that. It already has great features like user signup/login with various providers, and it has proper permission management. It has a very good API that is easy to use, and data can be accessed with GraphQL if needed. I found it easy to run locally for development, and the deployment process can be really easy as well.
Supabase is also an alternative I have tried once, but it had fewer features than Strapi at the time I tried it.
Redis is an amazing in-memory #database. But that's all it used to be: a really amazing distributed hash table for getting and setting key-value pairs. But recently, while consulting for a startup, I came across Redis Enterprise and it totally blew my mind.
I have never seen a big database company totally revamp its offerings like this. For example, MongoDB is still trying to find a way to monetize its DB by changing the license to the Server Side Public License. Elasticsearch did something similar, which is why Amazon created OpenSearch (a fork of Elasticsearch).
What can Redis Enterprise do?
They introduced the Bloom filter and cuckoo filter as first-class citizens. The Bloom filter is probably one of the most underrated yet most widely used algorithms I know of. Google Chrome uses a Bloom filter to check whether a URL exists in its dataset of malicious URLs.
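As a refresher on the idea the post leans on, here is a minimal Python sketch of a Bloom filter. This is illustrative only, not the RedisBloom implementation; the sizes and the salted-SHA-256 hashing scheme are arbitrary choices:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-bit array.
    Answers are "possibly present" or "definitely absent" --
    false positives are possible, false negatives are not."""

    def __init__(self, m_bits=1024, k_hashes=3):
        self.m = m_bits
        self.k = k_hashes
        self.bits = 0  # big integer used as a bit array

    def _positions(self, item):
        # Derive k positions by salting a single hash function.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item):
        return all(self.bits & (1 << pos) for pos in self._positions(item))

bf = BloomFilter()
bf.add("https://evil.example/bad-page")
print(bf.might_contain("https://evil.example/bad-page"))  # True
```

The same trade-off (tiny memory, probabilistic answers) is what makes it attractive as a first-class database type.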
Besides that, Redis also introduced a new JSON module that lets users save arbitrary JSON objects and then run queries against any of their fields.
Redis Enterprise has a lot more modules such as TimeSeries, Streams, AI, and more.
The entire ecosystem is really new but it totally has the capability to replace a whole suite of document databases, #caches, and indexed search systems.
The only "con" would be the pricing: it's a bit expensive, and it's not #opensource but uses the Redis Source Available License (RSAL).
I am pretty sure some companies are already working on an open-source alternative to Redis Enterprise and its various modules.
It's just amazing to see how databases are evolving. I would have never imagined someone using Redis as their main database but it's happening.
What do you think? Would you trust #Redis with your Production load?
As we're developing a critical piece of software, type safety is very important to minimize the errors we have. While Python supports type hints nowadays, Go makes type safety much easier to work with and allows us to be confident in the software we ship.
Take a look at our code on our GitHub.
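A tiny illustration of the difference the post is pointing at: Python's type hints document intent but are not enforced at runtime (the function is a made-up example):

```python
def add(a: int, b: int) -> int:
    # Type hints document intent, but CPython does not check them at runtime;
    # a static checker like mypy would flag the second call below.
    return a + b

print(add(2, 3))        # 5
print(add("ab", "cd"))  # "abcd": runs without error despite violating the hints
```

In Go, the equivalent misuse is a compile error, which is the confidence the post is describing.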
pnpm is one of the newer additions to our frontend stack. We used yarn for a relatively long time but recently decided to reevaluate its usefulness. pnpm caught our eye for its speed and more efficient disk usage, and even though it is less popular than its competitors, we decided to place a bet on it.
We chose Dapr so that our developers could leverage managed cloud resources without having to write "infrastructure as code" code. You can easily connect different providers for the same common abstractions, such as a State Store, a Secrets Store or a Pub/Sub mechanism. We don't use the Actors functionality, though.
I've been approached by a business consultant to program a website + web application for his client, a logistics company. The web application will have a tracking system for their GPS-enabled fleet (400 trucks).
Kindly advise me on which scalable stack I can use for the back-end. I'm planning to use React for the front-end.
And by back-end, I also include the database. I'm considering PostgreSQL as the database system.
Spring is a good choice for your needs, but you should build a proper microservice architecture for good scaling. Working with the database can be made easy with an ORM (e.g. Hibernate) and migrations (e.g. Liquibase). If you need the best performance and scaling on the frontend, you can use Angular or React.
Fauna is a serverless database where you store data as JSON. You also get a built-in HTTP GraphQL interface with a full authentication & authorization layer. That means you can skip your backend and call it directly from the frontend. With the ability to write data-transformation functions inside Fauna in its own language, FQL, we get a blazing-fast application.
Also, Fauna takes care of scaling and backups (all data is sharded across three different locations around the globe). That means we can fully focus on writing business logic and no longer have to worry about infrastructure.
I have been using Firebase with almost all my web projects as well as SwiftUI projects. I use it for the database as well as the user authentication via Google.
Is it good enough? I have learned MySQL but I'm not that comfortable with it…
So for user authentication and the database, should I keep using Firebase, or switch to MySQL or MongoDB? Or some other combination?
I’m not an expert, but I can tell you some things:
- Firebase is a great option for a very simple to implement, fast and reliable authentication method. Nonetheless, the free authentications are limited, so if you will potentially have millions of monthly authentications, it’s probably best to take the time to build it into your app directly.
- MySQL is great for simple tables where the data structures are not too complex, but it lacks some speed when you are trying to retrieve time-series data. Also, I believe it's a bit more difficult to distribute.
- MongoDB is great when your information is a bit more complex and you need very particular data structures: nested data, dynamic structures, etc. For me at least, it's a bit more complex to master than MySQL, but the freedom it gives you is incredible. It also performs super fast, especially with time-series data, and if I'm not wrong, it's more scalable.
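To illustrate the nested-structure point above, here is a sketch of a MongoDB-style document as a plain Python dict. The shape and field names are invented, and no live database is involved:

```python
# A single MongoDB-style document nests related data that would need
# several joined tables in MySQL.
user_doc = {
    "name": "Alice",
    "addresses": [
        {"type": "home", "city": "Oslo"},
        {"type": "work", "city": "Bergen"},
    ],
    "preferences": {"theme": "dark", "notifications": True},
}

# Reading a nested field is plain dictionary/list access:
work_city = next(a["city"] for a in user_doc["addresses"] if a["type"] == "work")
print(work_city)  # Bergen
```

In MySQL, the same data would typically live in `users`, `addresses`, and `preferences` tables joined by foreign keys.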
In general, almost all technologies have their good things, it’s just a matter of what you want to do and then choosing the right ones.
Look, if you are comfortable with Firebase you can go with it; after all, it's all about developing and running your program fast and bug-free. But Firebase is costly in the long run, and if you are comfortable with that cost then I suggest you go with it.
I am not really comfortable with it... I am currently using the free tier, but I want to switch from it. I am an enthusiast, so I won't mind doing the extra work... I just want a scalable, robust platform that is relatively easy to use.
When I am making a website, I do NOT use an online database, because APIs are hard to learn and just writing to files is easier. So when I code a website, I write to files for a DB. Oh, and by the way, there isn't a single file for the DB; each key has its own file. Look at the tree below for an example.
database/
--- key.txt
--- other key.txt
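The file-per-key layout above can be sketched in Python. This is a toy illustration, as the surrounding discussion implies; it has no locking or key sanitisation:

```python
import os

class FileStore:
    """Toy file-per-key store matching the tree above: one file per key."""

    def __init__(self, root="database"):
        self.root = root
        os.makedirs(root, exist_ok=True)

    def _path(self, key):
        # NOTE: real code must sanitise keys; "../evil" would escape the root.
        return os.path.join(self.root, f"{key}.txt")

    def set(self, key, value):
        with open(self._path(key), "w", encoding="utf-8") as f:
            f.write(value)

    def get(self, key, default=None):
        try:
            with open(self._path(key), encoding="utf-8") as f:
                return f.read()
        except FileNotFoundError:
            return default
```

Concurrent writers and large values are exactly where this breaks down and a proper database earns its keep.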
Yea, I find that works really well for most use cases. Only if you want to store a really large amount of data does it make sense to use a proper database.
As an advanced user, I prefer Postgres over MySQL. MySQL was the first database I learned at my institute, and I always had to go through that infamous date-and-time dilemma many Java devs know. Both are adequate for a small project. When I worked on a project with date- and time-intensive data, I spent a lot of time dealing with the conversions and transitions, which left me frustrated. I tried Postgres to see how well it could perform. To my surprise, everything became a breeze, and the transactions were faster too. I've been using Postgres ever since, and there's no more dilemma.
Hey, Alfred! I'm glad you made the switch over to Postgres! If you're looking for a highly reliable ORM, I would recommend Prisma, you can check it out at www.prisma.io - Have a great rest of your Sunday!
Thank you for the information. I will consider trying it sometime.
Creating a reliable, scalable app quickly has become much easier and more straightforward. The Microsoft extensions of the runtime libraries make implementing dependency injection, configuration, and logging very convenient and reduce the time spent writing boilerplate code. LINQ makes working with data a breeze, especially when you have data coming from a variety of sources. Although Entity Framework Core can be a bit much and takes some time to get used to, I think it is a great ORM overall and has a lot to offer.

Possibly my favorite part of building .NET apps is the deployment options available. Deploying an app as a single-file, self-contained executable has been a luxury when deploying to machines without the .NET Framework installed and/or where installing the latest version was not an option. Single-file deployment bundles all application dependencies and the framework into a single executable; the only other file that needs to be distributed is the appsettings.json file. Hosting the app in a Windows Service is cool and definitely beats working with Task Scheduler. It's clean and simple, which is a nice break from the norm.
I used Svelte in one of my projects a year ago and found it to be a compact and sleek framework. However, I wonder why it is not used as much as other frameworks. I would like to use it if there's an opportunity; component development is a breeze. Any views on this, or any opportunities in Svelte, would be appreciated.
We originally used Algolia for our search features. It's a great service; however, the cost was getting to be unmanageable for us. Algolia's pricing model is based on the number of search requests and the number of records, so if you produce a large number of small records, the price can quickly get out of hand even if your actual dataset doesn't take up that much space on disk. Spikes in internet traffic can also lead to unexpected increases in billing (even if the traffic comes from bots).
After migrating to Typesense Cloud we have been able to save A LOT of money without losing out on any of the performance we got from Algolia. I do not exaggerate when I say that our overhead for search is less than 25% of what it used to be. Typesense also has the following advantages:
Their cloud offering lets you configure your Typesense nodes and specify how many you want to spin up. This allows you to manage costs in a manner that is way more predictable. (You basically pay for servers/nodes instead of records and requests)
It's completely open source. We can spin up a cluster on our own servers or run it locally.