Consider the following algorithm to convert a list into a two-dimensional array (a matrix).

def list_to_matrix(myList, rows):
  list_length = length(myList)
  if (rows does not divide list_length):
    print("Warning: number of rows must divide the length of the list")
    return NULL

  columns = list_length / rows
  myMatrix = a rows x columns array
  list_counter = 0

  for row in [0, 1, ..., rows - 1]:
    for column in [0, 1, ..., columns - 1]:
      myMatrix[row, column] = myList[list_counter]
      list_counter = list_counter + 1

  return myMatrix
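
The pseudocode above can be translated into runnable Python as follows (a minimal sketch; the variable names mirror the pseudocode, and a list of lists stands in for the two-dimensional array):

```python
def list_to_matrix(myList, rows):
    list_length = len(myList)
    if list_length % rows != 0:
        print("Warning: number of rows must divide the length of the list")
        return None

    columns = list_length // rows
    # Pre-allocate a rows x columns matrix as a list of lists.
    myMatrix = [[None] * columns for _ in range(rows)]
    list_counter = 0

    # Each list element is copied exactly once:
    # the nested loops run rows * columns iterations in total.
    for row in range(rows):
        for column in range(columns):
            myMatrix[row][column] = myList[list_counter]
            list_counter += 1

    return myMatrix


print(list_to_matrix([1, 2, 3, 4, 5, 6], 2))  # [[1, 2, 3], [4, 5, 6]]
```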


Let `n` denote the length of the input parameter `myList`. What is the run-time complexity of the algorithm, using big-O notation?