UC Berkeley Faculty Association



APPEAL TO SUSPEND “APBEARS” UNTIL IT WORKS PROPERLY

TO: EVC George Breslauer

CC: Vice Provost Sheldon Zedeck, Associate Vice Provost Angelica Stacy, Associate Vice Chancellor Shelton Waggener, Dean Andrew Szeri; campus offices

FROM: The SAVE Coordinating Committee

Evidence and testimony from across campus document that APBears was “rolled out” in a condition unfit for use by staff and faculty. Moreover, it is potentially damaging to faculty who are now preparing dossiers for promotion review (see below). It is widely understood that some administrators in charge of this system knew about its deficiencies but suppressed that knowledge and refused to correct problems so as not to miss their “rollout” deadline. If they have not apprised you of this situation, we do so now and ask that you investigate it. Academic Senate Chair Fiona Doyle has already written to VP Zedeck and the chair of the Senate Budget Committee about this problem; VP Zedeck has responded to Deans and Chairs, but his response addresses only one of the system’s problems (the CSIR data).[1]

We believe that faculty should be able to use an accurate and well-designed online academic
personnel system. APBears, as it now runs, is not that system. We therefore ask the following:

1. Suspend APBears immediately and re-implement it only when a taskforce of campus department faculty and staff has reviewed its functionality and confirmed that the necessary and advisable changes to it have been made. Until then, faculty should be permitted to use the present bio-bib case system.
2. Make certain that any new iteration of APBears contains clear statements identifying which data are supplied by external (non-faculty) sources, whether errors in those data can or cannot be amended, and which information is supplied by faculty, and stating that faculty are responsible only for the information that they supply.
Here is a partial list of what is wrong with APBears:

1. It is full of procedural errors, glitches, and technical problems that are time-consuming to fix or work around. Some of these could be fixed with a diligent review by a committee of faculty from all levels and across campus departments. We ask you to convene this group.
[1] VP Zedeck’s response suggests that the CSIR data are compromised not because that system is flawed but because Departments “corrupted” the data by, for instance, “adding GSIs as instructors to the primary section to give them access to bSpace; inadvertently double booking classrooms.” If so, this points to the all-too-familiar ways in which Departments, faced with IT systems that are non-integrated or not workable in practice, have been forced to create work-arounds so that teaching and research on campus can proceed. Pace Zedeck’s concern, the real issue is that the compromised-data problem will persist (for all previous years in which the data are fixed in the public record) and cannot be remedied until the reasons why departments have been forced to create work-arounds are addressed.

2. The data supplied by the Administration on teaching and mentoring are unacceptably inaccurate. One professor found that she was credited with only 25% of her teaching; another, with 400%. The requests for detailed data on student mentoring and employment are unnecessary and burdensome; faculty are not the HR office. The first problem cannot be fixed because it involves CSIR data; the second can be fixed by switching from a pull-down to a narrative system and eliminating several of the requested information fields.
3. Faculty are prohibited from correcting many kinds of errors in the system, and some apparently cannot be corrected by anyone. Despite these errors, faculty are being required to sign a statement that they have read and approved all information in their files, even though they cannot see some of it. Faculty should not be coerced into signing a document that they cannot fully review. This situation is indefensible and probably legally actionable. It could be partly fixed by prominent statements on the website that acknowledge clearly and fully that the accuracy of CSIR data is in doubt, that note that data uploaded to the system at the time of its rollout cannot be corrected, and that clarify that faculty are responsible only for the accuracy of statements that they upload to the system. This does not solve the problem of the proportion of CSIR data that is incorrect or unanalyzed, but it may improve the future accuracy of entered data.
4. Current estimates are that preparing a case in this system typically takes 20-40 hours longer than the traditional case procedure, and this does not include the “one-time” uploading of personnel data and historical material. This could be remedied by eliminating requests for some data, removing the pull-down menus, and allowing greater use of narratives uploaded by the faculty (see items 6 and 7).
5. Our faculty are incredibly diverse in the products of their research, the modes of their teaching, and the scope of their professional activities. The “pull-down” menus are time-consuming and do not offer accurate descriptions or adequate alternatives. These should be eliminated and greater narrative freedom built in; otherwise faculty may as well just append accurate bio-bib statements and ignore the data fields.
6. There is no way to rank the importance of many activities; hence, a talk to a Cub Scout
troop is featured as prominently as election to a national academy. Chairing a panel could
mean a lot of work or none at all; there is no role for “convener” or “organizer.” The roles of
authors in publications are also not adequately assessed. This could be fixed with greater
narrative freedom and the abolition of pull-down menus.
7. The extent and kind of data being gathered represent an unreasonable burden on the faculty. Many of these data have to be entered in three different ways, which is redundant and time-consuming. Many categories do not accurately or adequately capture the work done on a project or activity, and they represent a “one-size-fits-all” approach to professional activity and achievement. The redundant entries and unnecessary fields should be eliminated.
8. It has not been thought out, or clarified to the faculty, how external referees will access case information in this online system, whose confidentiality is uneven and at risk. Currently the old “hard-copy” approach is being used. Why, then, the new system?
9. Department staff are spending an undue amount of time learning this system and trying
to interpret it and fix its problems for faculty, at a time when they can least afford to do so,
given additional job burdens related to staff cutbacks.

These problems are not simply system “growing pains” or “first time only” issues that are finding a speedy remedy. They appear to be intrinsic and endemic. This system was put into place and mandated before it was ready. The entire faculty and staff should not have to be the guinea pigs for it. Let’s not repeat the errors of the BFS system.

We estimate conservatively that the extra time this system imposes upon the faculty will cost the campus well over $300,000 in faculty time this year alone.[2] We cannot begin to estimate the loss of staff time. The argument that time will be saved down the road is not sufficient justification for implementing a system (and APBears is by no means the only one) that has not been adequately reviewed, tested, and corrected by its principal users before implementation. Nearly every IT system on campus winds up making the faculty spend time entering data and negotiating systems that are designed not to help research and teaching but to make the jobs of administrators easier. In the end, however, even that does not happen, because the systems, whether BFS, RES, or APBears, are so flawed that both administrative and faculty time are engulfed by trying to negotiate or work around them.

We agree that an online system ultimately could be easier for the faculty to use. We recognize that this is considered the case on some other campuses. However, given the structural problems of APBears, it is clear that this system is not ready to be implemented or used. It could be, but only after further study and correction. Thank you for your consideration of this unwieldy and burdensome campus crisis.

[2] The rough calculation: there are about 1,200 Academic Senate FTEs, of whom about 60% (roughly 720) are tenured. Tenured faculty undergo merit reviews every three years (about 240 cases per year); the roughly 480 untenured faculty are reviewed every two years (another 240 cases per year). This makes about 480, or in round figures 500, faculty who have to submit cases every year (we do not count adjuncts, who would inflate the figures). Estimate that these 500 faculty each take an EXTRA 20 hours to deal with the APBears system (an underestimate, by the account of anyone who is trying to work with it). That gives us an extra 10,000 hours of faculty time (not counting staff time). Estimate the median faculty salary at Berkeley at $75,000 including benefits (probably a low estimate). That annual salary divided by 50 weeks is $1,500 per week, so for a 40-hour week faculty earn about $37.50 per hour. An extra 10,000 hours spent by faculty on APBears is thus an additional $375,000 just this year. We think a truer estimate could be twice that, even without including staff time.
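For readers who wish to check or vary the footnote’s arithmetic, here is a minimal sketch of the same back-of-envelope calculation in Python. Every input (1,200 FTEs, 60% tenured, 20 extra hours per case, a $75,000 median salary) is the letter’s own rough estimate, not measured data.

```python
# Back-of-envelope estimate of the extra faculty-time cost of APBears,
# reproducing the footnote's figures; all inputs are rough estimates.

senate_fte = 1200                 # approximate Academic Senate FTEs
tenured = senate_fte * 0.60       # ~720 tenured, merit review every 3 years
untenured = senate_fte - tenured  # ~480 untenured, review every 2 years

cases_estimate = tenured / 3 + untenured / 2  # ~240 + ~240 = ~480 cases/year
cases_per_year = 500                          # the letter's round figure

extra_hours = cases_per_year * 20   # assumed EXTRA 20 hours per case -> 10,000 h
hourly_rate = 75_000 / 50 / 40      # $75k salary / 50 weeks / 40 h = $37.50/h
extra_cost = extra_hours * hourly_rate

print(f"~{cases_estimate:.0f} cases/yr, {extra_hours:,} extra hours, ${extra_cost:,.0f}")
# -> ~480 cases/yr, 10,000 extra hours, $375,000
```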
