The Spectrum brand is the result of a merger between Charter Communications, Bright House Networks (BHN) and Time Warner Cable (TWC). Charter Communications' parent company purchased BHN and TWC, forming one of the largest cable operators in the United States.
Charter is the second-largest cable operator in the United States by number of subscribers, and it provides its services under the brand name "Spectrum" across the country. It also ranks third among pay-TV operators and fifth by number of residential subscriber lines.
Charter Communications has a huge footprint, offering its services to nearly 25 million people across 41 states. As previously mentioned, after the merger Charter upgraded to an all-digital network for its voice, broadband and video services and re-branded its residential services as Charter Spectrum. The state-of-the-art fiber-optic service delivery system has also delivered higher bandwidth speeds than the old coaxial cable-based network infrastructure on which Charter previously relied.
Charter Communications has various divisions including but not limited to Advanced Engineering (AE), Network Operations, Wireless, Product, Security, and Data Analytics. I was a part of the Network As A Service (NaaS) team within Advanced Engineering, whose mission is to implement network automation using forward-thinking concepts such as Software Defined Networking (SDN) and intent-based networking, in order to reduce operational expenses and improve reliability and security.
This mission would be accomplished by minimizing manual touch in favor of automation, and by providing an automated tools framework for use within the organization. The areas the team is involved in are Service Provisioning, Configuration Management, Network Abstraction and Event Processing for Network Automation.
The learning outcomes of the internship included the research and development of solutions for network configuration management, network event correlation, fault isolation and root-cause analysis, potentially applying an unsupervised machine learning approach. For these needs, the NaaS/Network Automation team is responsible for developing the corresponding applications and services.
The requirements of the internship included knowledge of the Python programming language, Agile software development and the development of RESTful APIs. Other technical skills required were knowledge of SDN/NFV frameworks, an understanding of networking concepts and comfort working with Linux and Docker containers. Strong interpersonal, organizational and analytical skills were also required.
An understanding of the telecommunications industry was desired, along with a good understanding of various networking concepts, a cumulative GPA above 3.0 and proficiency in MS Office programs. Any previous internship experience was considered an advantage, and candidates were required to possess excellent oral and written communication skills. Energy, enthusiasm and creativity were desired, along with good judgement, discretion and confidentiality.
As part of the NaaS (Network as a Service) team, I contributed to two areas for future deliverables and various proof-of-concept initiatives. Before diving deep into those two areas, I worked on testing and comparing an open-source query language with one built in-house by the NaaS team. This project was a joint effort with fellow interns on the NaaS team of AE at Charter. After a thorough analysis of the two query languages across different possible use cases, their advantages and disadvantages were presented to the team.
The first area I worked on, pair programming with one of the team's principal engineers, was the development of APIs for specific business needs in the NaaS application architecture. Below is a description of the two significant APIs developed for the team and the methods involved.
The first API developed was the dynamic query API. I initially built a model/schema for the relational database using SQLAlchemy, then developed the controller, which takes the ID of a stored SQL query from the user and executes that query against the related tables in the relational database. The attributes of a dynamic query include its ID, the SQL text, and the name, type and author of the query.
This API is useful for rapid prototyping because it provides a simple mechanism for application components to obtain data from a database without writing a custom API. The REST API supports create, read, update and delete operations, allowing developers to manage the lifecycle of a query. Since the API allows users to store complex SQL queries, it can return relational data from any set of tables in the database.
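As a rough illustration of the idea (not the team's actual code, which used SQLAlchemy and a Flask-based REST layer; the table and column names below are hypothetical), a stored query can be looked up by its ID and executed against the database:

```python
import sqlite3

# In-memory database standing in for the team's relational store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE devices (id INTEGER PRIMARY KEY, name TEXT, state TEXT);
    INSERT INTO devices VALUES (1, 'router-a', 'up'), (2, 'router-b', 'down');

    -- Table of stored queries: each row is one "dynamic query",
    -- with the id, name, type, author and SQL text attributes.
    CREATE TABLE dynamic_queries (
        id INTEGER PRIMARY KEY,
        name TEXT, type TEXT, author TEXT,
        sql_text TEXT
    );
    INSERT INTO dynamic_queries VALUES
        (1, 'down-devices', 'report', 'intern',
         'SELECT name FROM devices WHERE state = ''down''');
""")

def run_dynamic_query(query_id):
    """The controller's job: look up a stored SQL query by ID and execute it."""
    row = conn.execute(
        "SELECT sql_text FROM dynamic_queries WHERE id = ?", (query_id,)
    ).fetchone()
    if row is None:
        raise KeyError(f"no dynamic query with id {query_id}")
    return conn.execute(row[0]).fetchall()

print(run_dynamic_query(1))  # [('router-b',)]
```

Because the stored SQL text can join any tables it likes, new reports can be added as rows rather than as new endpoints.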
The second was the application configuration CRUD API, built to provide initial configuration data to an application at start time. We first built a schema for a Postgres database with an ID and a JSON data column in the configuration table. The JSON column allows the application developer to store structured data without defining a rigid schema beforehand. I then built the CRUD API, which supports the Create, Read, Update and Delete operations on the database.
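A minimal sketch of those CRUD operations, using Python's stdlib sqlite3 and json modules as stand-ins for the actual Postgres/SQLAlchemy stack (the table and application names here are hypothetical):

```python
import json
import sqlite3

# In-memory stand-in for the configuration database; the real service used
# Postgres with a JSON column so applications could store arbitrary
# structured configuration without a fixed schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE app_config (id TEXT PRIMARY KEY, data TEXT)")

def create_config(app_id, config):
    conn.execute("INSERT INTO app_config VALUES (?, ?)",
                 (app_id, json.dumps(config)))

def read_config(app_id):
    row = conn.execute("SELECT data FROM app_config WHERE id = ?",
                       (app_id,)).fetchone()
    return json.loads(row[0]) if row else None

def update_config(app_id, config):
    conn.execute("UPDATE app_config SET data = ? WHERE id = ?",
                 (json.dumps(config), app_id))

def delete_config(app_id):
    conn.execute("DELETE FROM app_config WHERE id = ?", (app_id,))

# An application reads its configuration once at start time.
create_config("naas-ui", {"log_level": "info", "port": 8080})
update_config("naas-ui", {"log_level": "debug", "port": 8080})
print(read_config("naas-ui")["log_level"])  # debug
```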
Swagger documentation was then written, which was helpful in exercising the different CRUD operations developed for accessing data from the configuration DB. Several unit test cases were also built for each of the API calls, all of which passed. Finally, the code developed for moving the configuration data to a relational database was deployed into a container ecosystem so that the service can be used by members of the NaaS team in the future. The second area I worked on was the Artificial Intelligence (AI) project, described below.
The goal of this Artificial Intelligence project was to provide a unified tool to capture a device’s state and show the difference in the states at different points in time. While the use cases and specific technologies for this initiative are proprietary to Charter Communications, the following should give some insight. The solution contained the following components:
Graphical User Interface – Network engineers only interact with a web-based User Interface to specify the following:
After specifying the parameters, the user clicks “Run”.
Workflows – The tasks/actions are called workflows. The parameters from the UI are used to parameterize the workflows, and the workflow engine executes the parameterized workflows.
Network Controller (NC) – The workflows obtain device data from the network through the network controller.
Database (DB) – During execution of the workflows, the information retrieved from interactions with the network controller is stored in a database.
Machine Learning (ML) Algorithms – The data for the ML algorithms is retrieved from the database. The retrieved data goes through a preprocessing step in which it is formatted and normalized before being fed to the ML algorithms. The algorithms we used for our use case were the Naïve Bayes classifier and the Support Vector Machine, which are supervised machine learning approaches for determining success or failure for the given use case, along with clustering algorithms, which are unsupervised.
Each of the ML algorithms mentioned above was able to predict the correct state of a use case from the input attribute values. I also calculated the Precision, Recall, Accuracy and F1 score metrics for all the classification algorithms mentioned above.
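For a binary success/failure use case, these metrics can be computed directly from the counts of true positives, false positives and false negatives; the labels below are made up for illustration (in practice, scikit-learn's metrics module provides the same calculations):

```python
# Hypothetical ground-truth labels and model predictions for a
# binary success/failure use case.
y_true = ["success", "success", "failure", "failure", "success", "failure"]
y_pred = ["success", "failure", "failure", "failure", "success", "success"]

# Count true positives, false positives and false negatives,
# treating "success" as the positive class.
tp = sum(t == p == "success" for t, p in zip(y_true, y_pred))
fp = sum(t == "failure" and p == "success" for t, p in zip(y_true, y_pred))
fn = sum(t == "success" and p == "failure" for t, p in zip(y_true, y_pred))

precision = tp / (tp + fp)                       # of predicted successes, how many were right
recall = tp / (tp + fn)                          # of actual successes, how many were found
f1 = 2 * precision * recall / (precision + recall)
accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

print(precision, recall, f1, accuracy)
```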
The general tools and technologies that I learned during the course of my internship are API development, workflow engines, containerized services, data modeling, relational databases, NoSQL databases, query language development, publish/subscribe buses, telemetry data, clustering, Naïve Bayes classifier, Support Vector Machine, RESTful services, unit testing, Swagger documentation, GitLab, Postman, SCRUM tools and various Python packages like flask-restplus, sklearn, psycopg, Jinja, PyYAML, SQLAlchemy, python-jose, gunicorn, flask-RESTful, flask-swagger, PyJWT, pycurl, Flask-Bcrypt, WTForms etc.
Below are the different courses and the description of how they helped me gain valuable experience throughout the course of this internship:
This course taught me about different data structures such as Binary Search Trees, Binary Trees and Heaps, and the course assignments made me proficient in Python, which was the primary language used in the internship to develop various APIs and machine learning algorithms. The concepts of analyzing the complexity of different algorithms also helped me develop algorithms at Charter that are highly efficient in terms of both time and space.
This course helped me gain deep knowledge of different machine learning algorithms such as the Naïve Bayes classifier, Clustering, Support Vector Machine, PCA-based Anomaly Detection etc., which I was able to apply in the AI project described above.
This course had us write a grammar for a language and develop the Lexer, Parser and Interpreter for a Python-like language from scratch, using Python. This helped me during the internship in understanding a query language developed by one of the Principal Engineers on our team at Charter, a Domain Specific Language built in Python, and in testing that language with different basic network-related audits.
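In the spirit of that course project, a toy tokenizer for a small Python-like language can be written with the stdlib re module (the token set here is hypothetical, not the one from the course or the Charter DSL):

```python
import re

# Token kinds and the regular expression for each; order matters only
# in that earlier alternatives win ties.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("NAME",   r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=()]"),
    ("SKIP",   r"\s+"),
]
TOKEN_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(src):
    """Yield (kind, text) pairs for the source string, skipping whitespace."""
    for m in TOKEN_RE.finditer(src):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("x = 2 + 40")))
# [('NAME', 'x'), ('OP', '='), ('NUMBER', '2'), ('OP', '+'), ('NUMBER', '40')]
```

A parser would then consume this token stream to build a syntax tree, and an interpreter would walk that tree.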
These courses, from two different semesters, gave me deep knowledge of the Agile software development process, which helped me participate in the daily stand-up meetings at Charter and speak about my work for the day and any blockers I had. These courses and their projects also helped me understand the importance of User Stories, Scrum and Pair Programming, through which I gained valuable experience assisting the team in the development of different APIs, ML algorithms and several unit tests, along with solid basic knowledge of the computer networking domain.
The project that was part of this course taught me about JSON objects, parsing JSON and some concepts related to building Web APIs, which helped me during the internship when I was building the APIs and analyzing the network configuration in different JSON objects, as described previously in the report.
Through this internship, I have learnt a lot about API development, RESTful services, network configuration and management, the application of machine learning algorithms in the computer networking domain, and more. I have also learnt about Docker, which lets us package the applications/services we develop into containers so that they can be used by other developers in the future, and which helped me gain experience with real-world applications.
All these concepts will help me successfully complete my Applied Project/Software Factory in the final semester, as well as courses in future semesters at ASU. The knowledge of the tools and technologies acquired during my internship at Charter, as described above, will also help me develop projects at ASU that are useful to other engineers in the future, and to make the world a better place for people to live in.