
PCC is about to launch Starfish, a software platform that sends early alerts to students based on their class performance and faculty concerns, then connects them in real time with the resources designed to help them, such as counseling.

Valued at approximately one hundred thousand dollars, this data-driven system will give students, faculty and counselors a new way to communicate their concerns and take action immediately. It will put them, quite literally, on the same (web)page.

“You have been starfished!” is the deceptively cheerful message students will receive when faculty and counselors judge that a particular student would benefit from some sort of intervention and support. Without leaving the online platform, the “starfished” student will be able to make the necessary appointment on the spot.

“This is all very new to us,” counselor Desiree Zuniga said. “We were notified one or two weeks ago and now we are getting started.”

Other colleges and universities have already been using this kind of data-driven system based on predictive analytics. So far, their experience shows that making data-informed decisions in educational institutions can lead both to significant positive outcomes and to embarrassing failures.

Georgia State University uses the Graduation and Progression Success System, which sends an early alert to a counselor when a student receives an unsatisfactory grade in a class relevant to his or her major, doesn’t take a required class within the recommended time frame, or signs up for a class not relevant to his or her major.

The system has been carefully crafted, and the Atlanta-based four-year public university has seen a measurable improvement in the academic performance of low-income and minority students.

Leaders at PCC are also looking forward to using the data collected by Starfish to help students.

“We will be able to learn more about our students and the challenges they face,” said Armando Duran, dean of Counseling and Student Services. “With this type of information, we can provide the resources and support needed.”

Software that collects student data and uses algorithms to identify those who need intervention is extremely helpful for school administrators. These are powerful tools, but it is at the user’s discretion to decide whether to use them to include or to exclude students.

In February 2016, the private campus of Mount Saint Mary’s University in Maryland made national news when the school newspaper’s advisor was fired over an article quoting the school’s president, Simon Newman, comparing struggling freshmen to bunnies that should be drowned or shot.

Newman had conducted an online campus survey, and from the data he was able to predict which students were most likely to drop out along the way. Instead of choosing to offer support, he encouraged them to leave the school before they were included in the retention data that the institution reports to the federal government.

The data that predictive models like Starfish can include ranges from demographic information and test scores to high school and college GPA, class attendance and student behavior, and the algorithms can use them in different combinations.

PCC counselors have not yet decided on their software’s settings and configuration.

“Information such as the reasons why students miss class, do not submit homework assignments, and why students are not engaged in their class,” Duran said. “We are looking for common themes and behaviors that impact student success. The non-cognitive factors that impact student success, to be exact.”

In a demonstration given during a conference call between the PCC counselors and a Starfish representative, it was clear that just by adding a student’s picture to the system, more information becomes available to those who access the data, such as gender, race and possibly a disability.

“I’m a bit concerned,” counselor Tomas Riojas said. “I’m not sure I want to have all this information as a counselor before meeting with the student. What about the risk of implicit bias?”

Other people have similar concerns.

The Washington-based think tank New America published an article by policy analysts Manuela Ekowo and Iris Palmer with the explanatory title “The Promises and Perils of Predictive Analytics in Higher Education.”

“Colleges are really excited to have a new tool to identify students who need support and intervene before it’s too late,” Ekowo said over the phone. “But there is a dark side to it.”

Ekowo and Palmer warn institutions about the risk of unintentionally reinforcing inequality when they target students for intervention using race, ethnicity, gender or socioeconomic status.

“Students of color or of lower income are traditionally the more at-risk students,” Ekowo said. “When they start college, are you going to give them a fresh start, or are you going to assume they’ll need to be flagged?”

Beyond the risk of profiling students, Ekowo and Palmer urge institutions that use predictive analytics to draw up ethical guidelines regarding transparency, privacy and information security.

Students’ educational records are protected by federal law, but the last update to the Family Educational Rights and Privacy Act (FERPA) was made in 2001. Since then, colleges have been collecting new kinds of student data, and digital traces of learning activity are rapidly increasing and evolving.

“Early-alert systems often rely on records from advisors that help determine the appropriate next step for a student after each visit,” Ekowo said.

In the Starfish system adopted by PCC, there is a field named “notes,” where faculty and counselors can write down their comments about students. It is not yet clear whether these notes will be considered educational records. If so, students must know that they have the right to review them and to correct them if necessary.

Sharing information is also a concern.

“Sharing these records without the student’s consent or with parties without a legitimate educational interest may be a violation of FERPA,” said Ekowo.

The launch at PCC is imminent, and Duran said he will first introduce the system to a smaller group of students, those enrolled in the Pathways program.

As for ethical guidelines specifically addressing the use of predictive analytics at PCC, administrators have not yet formally expressed their thoughts.

“We have received various training and professional development opportunities on this very topic,” Duran said. “We anticipate continuing to seek professional development opportunities on implicit bias and related topics.”
